{"id":19681,"date":"2018-11-07T17:19:42","date_gmt":"2018-11-07T22:19:42","guid":{"rendered":"https:\/\/itp.nyu.edu\/opportunities\/?p=19681"},"modified":"2018-11-07T17:19:42","modified_gmt":"2018-11-07T22:19:42","slug":"call-papers-for-the-second-workshop-on-intelligent-music-interfaces","status":"publish","type":"post","link":"https:\/\/itp.nyu.edu\/opportunities\/2018\/11\/07\/call-papers-for-the-second-workshop-on-intelligent-music-interfaces\/","title":{"rendered":"CALL: Papers for the Second Workshop on Intelligent Music Interfaces"},"content":{"rendered":"<p><strong>Call for Papers<\/strong><\/p>\n<p><strong>2nd Workshop on Intelligent Music Interfaces for Listening and Creation (MILC)<\/strong> held on March 20th, 2019 in conjunction with the\u00a0 24th ACM International Conference on Intelligent User Interfaces Los Angeles, CA, USA (<a href=\"https:\/\/urldefense.proofpoint.com\/v2\/url?u=http-3A__iui.acm.org_2019_&amp;d=DwIFaQ&amp;c=slrrB7dE8n7gBJbeO0g-IQ&amp;r=6mNXpvBEZ0Ligfr5AXVHG1dO_LIz7ct6h-SHXJgcgD8&amp;m=v2QmRHI4KtVAfzTQ4mm84P5JcWKudL9i8JB3EzacyH8&amp;s=N3x4m8r48MdLjx69hdTZulvxGAuKkgP6bOLk2MnJVIc&amp;e=\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/urldefense.proofpoint.com\/v2\/url?u%3Dhttp-3A__iui.acm.org_2019_%26d%3DDwIFaQ%26c%3DslrrB7dE8n7gBJbeO0g-IQ%26r%3D6mNXpvBEZ0Ligfr5AXVHG1dO_LIz7ct6h-SHXJgcgD8%26m%3Dv2QmRHI4KtVAfzTQ4mm84P5JcWKudL9i8JB3EzacyH8%26s%3DN3x4m8r48MdLjx69hdTZulvxGAuKkgP6bOLk2MnJVIc%26e%3D&amp;source=gmail&amp;ust=1541692863012000&amp;usg=AFQjCNFRaEyEfUevLVndfXrY3AdJKvnXrQ\">https:\/\/urldefense.proofpoint<wbr \/>.com\/v2\/url?u=http-3A__iui.acm<wbr \/>.org_2019_&amp;d=DwIFaQ&amp;c=slrrB7dE<wbr \/>8n7gBJbeO0g-IQ&amp;r=6mNXpvBEZ0Lig<wbr \/>fr5AXVHG1dO_LIz7ct6h-<wbr \/>SHXJgcgD8&amp;m=v2QmRHI4KtVAfzTQ4m<wbr \/>m84P5JcWKudL9i8JB3EzacyH8&amp;s=<wbr \/>N3x4m8r48MdLjx69hdTZulvxGAuKkg<wbr \/>P6bOLk2MnJVIc&amp;e=<\/a>)<\/p>\n<p>Workshop website:\u00a0<a 
href=\"https:\/\/urldefense.proofpoint.com\/v2\/url?u=https-3A__milc2019.github.io&amp;d=DwIFaQ&amp;c=slrrB7dE8n7gBJbeO0g-IQ&amp;r=6mNXpvBEZ0Ligfr5AXVHG1dO_LIz7ct6h-SHXJgcgD8&amp;m=v2QmRHI4KtVAfzTQ4mm84P5JcWKudL9i8JB3EzacyH8&amp;s=J-ZoWR3y0u3u7jYnyhM780pV76alGgPl3bzJplXV8ac&amp;e=\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/urldefense.proofpoint.com\/v2\/url?u%3Dhttps-3A__milc2019.github.io%26d%3DDwIFaQ%26c%3DslrrB7dE8n7gBJbeO0g-IQ%26r%3D6mNXpvBEZ0Ligfr5AXVHG1dO_LIz7ct6h-SHXJgcgD8%26m%3Dv2QmRHI4KtVAfzTQ4mm84P5JcWKudL9i8JB3EzacyH8%26s%3DJ-ZoWR3y0u3u7jYnyhM780pV76alGgPl3bzJplXV8ac%26e%3D&amp;source=gmail&amp;ust=1541692863012000&amp;usg=AFQjCNHhhTDRqIhl9eIc9TRtqZ8NT8Mksw\">https:\/\/urldefense.proofpoint.<wbr \/>com\/v2\/url?u=https-3A__milc201<wbr \/>9.github.io&amp;d=DwIFaQ&amp;c=slrrB7d<wbr \/>E8n7gBJbeO0g-IQ&amp;r=6mNXpvBEZ0Li<wbr \/>gfr5AXVHG1dO_LIz7ct6h-SHXJgcgD<wbr \/>8&amp;m=v2QmRHI4KtVAfzTQ4mm84P5JcW<wbr \/>KudL9i8JB3EzacyH8&amp;s=J-ZoWR3y0u<wbr \/>3u7jYnyhM780pV76alGgPl3bzJplXV<wbr \/>8ac&amp;e=<\/a><\/p>\n<p>Today\u2019s music ecosystem is permeated by digital technology \u2014 from recording to production to distribution to consumption. Intelligent technologies and interfaces play a crucial role during all these steps. On the music creation side, tools and interfaces like new sensor-based musical instruments or software like digital audio workstations (DAWs) and sound and sample browsers support creativity. Generative systems can support novice and professional musicians by automatically synthesising new sounds or even new musical material. 
On the music consumption side, tools and interfaces such as recommender systems, automatic radio stations, or active listening applications allow users to navigate the virtually endless spaces of music repositories.<\/p>\n<p>Both ends of the music market therefore heavily rely on and benefit from intelligent approaches that enable users to access sound and music in unprecedented ways. This ongoing trend draws from manifold areas such as interactive machine learning, music information retrieval (MIR), in particular content-based retrieval systems, recommender systems, human-computer interaction, and adaptive systems, to name but a few prominent examples. Following the successful first edition held in Tokyo in 2018, the 2nd Workshop on Intelligent Music Interfaces for Listening and Creation (MILC 2019) will bring together researchers from these communities and provide a forum for the latest trends in user-centric machine learning and interfaces for music consumption and creation.<\/p>\n<p>Exemplary Topics of Interest<br \/>\n\u2022 Music and audio search and browsing interfaces<br \/>\n\u2022 Adaptive music user interfaces<br \/>\n\u2022 Music learning interfaces<br \/>\n\u2022 Music recommender systems<br \/>\n\u2022 Gamification in music interfaces<br \/>\n\u2022 Novel visualization paradigms<br \/>\n\u2022 New technologies for human expression, creativity, and embodied interaction<br \/>\n\u2022 Machine learning for new digital musical instruments<br \/>\n\u2022 Gestural interfaces for music creation and listening<br \/>\n\u2022 Accessible music making technologies<br \/>\n\u2022 Intelligent systems for music composition<br \/>\n\u2022 User modeling for personalized music interfaces<\/p>\n<p>Important Dates<br \/>\n\u2022 Deadline for paper submission: December 7, 2018<br \/>\n\u2022 Acceptance notification for paper submissions: January 14, 2019<br \/>\n\u2022 Deadline for final copy of accepted papers: February 15, 2019<br \/>\n\u2022 Workshop date: 
March 20, 2019<\/p>\n<p>Organizers<br \/>\n\u2022 Peter Knees, TU Wien, Austria<br \/>\n\u2022 Markus Schedl, Johannes Kepler University Linz, Austria<br \/>\n\u2022 Rebecca Fiebrink, Goldsmiths, University of London, UK<\/p>\n<p>Contact:\u00a0<a href=\"mailto:milc2019@easychair.org\" target=\"_blank\" rel=\"noopener\">milc2019@easychair.org<\/a><\/p>\n<p>Submissions<br \/>\nWe solicit three types of submissions in SIGCHI format:<br \/>\n\u2022 Full papers (up to 6 pages)<br \/>\n\u2022 Short papers (up to 4 pages)<br \/>\n\u2022 Demo papers (up to 4 pages)<\/p>\n<p>Please submit your paper via EasyChair (<a href=\"https:\/\/easychair.org\/conferences\/?conf=milc2019\" target=\"_blank\" rel=\"noopener noreferrer\">https:\/\/easychair.org\/conferences\/?conf=milc2019<\/a>). Submissions will be reviewed by at least three members of the program committee. 
One author of each accepted submission will be required to attend and give a presentation at the workshop.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Call for Papers 2nd Workshop on Intelligent Music Interfaces for Listening and Creation (MILC) held on March 20th, 2019 in conjunction with the\u00a0 24th ACM&#8230;<\/p>\n","protected":false},"author":122,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[5],"tags":[],"class_list":["post-19681","post","type-post","status-publish","format-standard","hentry","category-call","entry"],"_links":{"self":[{"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/posts\/19681","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/users\/122"}],"replies":[{"embeddable":true,"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/comments?post=19681"}],"version-history":[{"count":0,"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/posts\/19681\/revisions"}],"wp:attachment":[{"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/media?parent=19681"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/categories?post=19681"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/tags?post=19681"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}