Compressive Learning meets privacy
Shannon Seminar Room, Place du Levant 3, Maxwell Building, 1st floor -- Wednesday, 03 April 2019 at 11:00 (45 min.)
Compressive Learning (CL) is a framework where a target learning task (e.g., clustering or density fitting) is not performed on the whole dataset of signals, but on a heavily compressed representation of it (called the sketch), enabling training with reduced time and memory resources. Because the sketch only keeps track of general tendencies (i.e., generalized moments) of the dataset while discarding individual data records, previous work argued that CL should protect the privacy of the users who contributed to the dataset, but without providing formal arguments to back up this claim. This work aims to formalize that observation.
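To make the notion of a sketch concrete, here is a minimal numerical illustration, assuming the random-Fourier-feature sketch commonly used in the compressive learning literature; the function name, dimensions, and frequency distribution below are illustrative choices, not necessarily the speaker's exact construction.

```python
import numpy as np

def compute_sketch(X, Omega):
    """Average random Fourier features of the dataset X (n x d).

    Omega (d x m) holds random frequency vectors; the sketch is the
    empirical mean of exp(i * x^T omega_j) over all records, i.e. a
    vector of m generalized moments summarizing the whole dataset.
    """
    # n x m matrix of complex features, one row per data record
    features = np.exp(1j * X @ Omega)
    # Averaging pools individual records into m dataset-level moments
    return features.mean(axis=0)

# Illustrative usage: 10,000 records in dimension 10, sketch of size 200
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 10))
Omega = rng.normal(size=(10, 200))
z = compute_sketch(X, Omega)
print(z.shape)  # (200,) -- size is fixed, independent of the number of records
```

The learning task is then solved from z alone: its size depends only on the number of moments m, not on the number of records, which is what enables the reduced time and memory cost, and the averaging step is why individual records are not directly recoverable.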