Shannon Seminar Room (Place du Levant 3, Maxwell Building, 1st floor) -- Wednesday, 04 March 2020 at 11:00 (45 min.)
Using Approximate Computing to Improve the Efficiency of LSTM Neural Networks
Within the growing field of Artificial Neural Networks, Recurrent Neural Networks (RNNs) are often used for sequence-related applications. Long Short-Term Memory (LSTM) networks are an improved and widely adopted variant of RNNs. To achieve high accuracy, researchers typically build large-scale LSTM networks, which are both time-consuming and power-consuming.
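To illustrate why large LSTM networks become so costly, the sketch below implements one standard LSTM cell update in numpy (this is the textbook formulation, not the speaker's implementation; the function name, dimensions, and random weights are illustrative assumptions). Each time step requires four gate computations driven by two dense matrix products, and these multiply-accumulate operations dominate the runtime and energy of a large, unrolled network.

import numpy as np

def lstm_cell_step(x_t, h_prev, c_prev, W, U, b):
    # One LSTM time step (standard formulation).
    # W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) biases;
    # the four row blocks hold the input, forget, candidate, and output gates.
    H = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b            # two dense matrix products per step
    i = 1 / (1 + np.exp(-z[0:H]))           # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))         # forget gate
    g = np.tanh(z[2*H:3*H])                 # candidate cell state
    o = 1 / (1 + np.exp(-z[3*H:4*H]))       # output gate
    c_t = f * c_prev + i * g                # new cell state
    h_t = o * np.tanh(c_t)                  # new hidden state
    return h_t, c_t

# Tiny usage example with made-up sizes (D=8 input features, H=16 hidden units)
D, H = 8, 16
rng = np.random.default_rng(0)
h, c = np.zeros(H), np.zeros(H)
W = rng.standard_normal((4*H, D))
U = rng.standard_normal((4*H, H))
b = np.zeros(4*H)
for t in range(5):                          # unroll over a short input sequence
    h, c = lstm_cell_step(rng.standard_normal(D), h, c, W, U, b)

Because the per-step cost scales roughly with 4H(D + H) multiply-accumulates, widening the hidden state or stacking layers quickly inflates both latency and energy, which is the inefficiency that approximate-computing techniques aim to reduce.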