280
Warden, Pete, “Software That Can See Will Change Privacy Forever”, MIT Technology Review, July 29, 2014, http://www.technologyreview.com/view/529396/software-that-can-see-will-change-privacy-forever.
281
Ekman, Paul, and Wallace V. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement (Palo Alto: Consulting Psychologists Press, 1978).
282
See, for example, Fasel, B., and Juergen Luettin, “Automatic Facial Expression Analysis: A Survey”, Pattern Recognition 36, no. 1 (January 2003), pp. 259–275, http://www.sciencedirect.com/science/article/pii/S0031320302000523; and Bartlett, Marian Stewart, Gwen Littlewort-Ford, Javier Movellan, Ian Fasel, and Mark Frank, “Automated Facial Action Coding System”, US Patent no. US8798374 B2, August 26, 2009, https://www.google.com/patents/US8798374.
283
Ekman, Paul, Emotions Revealed: Recognizing Faces and Feelings to Improve Communication and Emotional Life (New York: Times Books, 2003), pp. 3–8.
284
Ibid., p. 220.
285
This assumes the 4K resolution used in mass-market consumer electronics. Author’s interview with Javier R. Movellan, cofounder and lead research scientist of Emotient, November 14, 2015.
286
Dwoskin, Elizabeth, and Evelyn M. Rusli, “The Technology That Unmasks Your Hidden Emotions”, Wall Street Journal, January 28, 2015, http://www.wsj.com/articles/startups-see-your-face-unmask-your-emotions-1422472398. The video of faked expressions of pain used to train the model appears in “Are These People in Real Pain or Just Faking It?”, New York Times, April 28, 2014, http://www.nytimes.com/interactive/2014/04/28/science/faking-pain.html.
287
Truong, Alice, “This Google Glass App Will Detect Your Emotions, Then Relay Them Back to Retailers”, Fast Company, March 6, 2014, http://www.fastcompany.com/3027342/fast-feed/this-google-glass-app-will-detect-your-emotions-then-relay-them-back-to-retailers.
288
Kokalitcheva, Kia, “Apple Acquires Startup That Reads Emotions From Facial Expressions”, Fortune, January 7, 2016, http://fortune.com/2016/01/07/apple-emotient-acquisition.
289
“Does My Ad Evoke the Emotions I Want It To? LG, ‘Stage Fright,’” Realeyes, n.d., http://www.realeyesit.com/case-study-lg. LG’s finished ad can be seen at https://www.youtube.com/watch?v=Yf636vLep8s.
290
Picard, Rosalind W., “Future Affective Technology for Autism and Emotion Communication”, Philosophical Transactions of the Royal Society of London, Series B, Biological Sciences 364, no. 1535 (December 2009), pp. 3575–3584.
291
Bosker, Bianca, “Affectiva’s Emotion Recognition Tech: When Machines Know What You’re Feeling”, Huffington Post, December 24, 2012, http://www.huffingtonpost.com/2012/12/24/affectiva-emotion-recognition-technology_n_2360136.html.
292
Researchers have found that speech- and emotion-recognition systems achieve greater accuracy using the cosine-transform coefficients of pure-tone Fourier frequencies (a mathematical description of sound frequencies) than with ordinary measurements of tone. I use the term “tone” for simplicity.
293
For the VENEC corpus (Vocal Expressions of Nineteen Emotions across Cultures), actors were asked to express three levels of intensity of nineteen emotions: affection, amusement, anger, contempt, disgust, distress, fear, guilt, happiness, interest, lust, negative surprise, neutral, positive surprise, pride, relief, sadness, serenity, and shame. See Laukka, Petri, Hillary Anger Elfenbein, Wanda Chui, and Nutankumar S. Thingujam, “Presenting the VENEC Corpus: Development of a Cross-Cultural Corpus of Vocal Emotional Expressions and a Novel Method of Annotating Emotion Appraisals”, Proceedings of the LREC 2010 Workshop on Corpora for Research on Emotion and Affect (Malta: European Language Resources Association, 2010), pp. 53–57, http://www.diva-portal.org/smash/record.jsf?pid=diva2%3A373848&dswid=760.
294
Neiberg, Daniel, and Joakim Gustafson, “Cues to Perceived Functions of Acted and Spontaneous Feedback Expressions”, Proceedings of the Interdisciplinary Workshop on Feedback Behaviors in Dialog, September 7–8, 2012, Stevenson, WA, pp. 53–56, http://www.cs.utep.edu/nigel/feedback/proceedings/full-proceedings.pdf.
295
Lutfi, Syaheerah Lebai, Fernando Fernández-Martínez, Juan Manuel Lucas-Cuesta, Lorena López-Lebón, and Juan Manuel Montero, “A Satisfaction-Based Model for Affect Recognition from Conversational Features in Spoken Dialog Systems”, Speech Communication 55, nos. 7–8 (September 2013), pp. 825–840, http://www.researchgate.net/publication/257012012_A_satisfaction-based_model_for_affect_recognition_from_conversational_features_in_spoken_dialog_systems.
296
Lunden, Ingrid, “LiveOps Raises Another $30M, Acquires UserEvents to Expand Its Cloud Contact Center Platform”, TechCrunch, January 27, 2014, http://techcrunch.com/2014/01/27/liveops-raises-another-30m-acquires-userevents-to-expand-its-cloud-contact-center-platform-with-routing.
297
Bertolucci, Jeff, “Big Data: Matching Personalities in the Call Center”, InformationWeek, February 17, 2015, http://www.informationweek.com/big-data/big-data-analytics/big-data-matching-personalities-in-the-call-center/d/d-id/1319108.
298
Thrun, Sebastian, “From Self-Driving Cars to Retraining People”, Next: Economy O’Reilly Summit: What’s the Future of Work? November 12, 2015, San Francisco, CA, http://conferences.oreilly.com/next-economy/public/schedule/detail/44930.
299
Kanevsky, Dimitri, “IBM 5 in 5: Hearing”, IBM Research News, December 17, 2012, http://ibmresearchnews.blogspot.co.uk/2012/12/ibm-5-in-5-2012-hearing.html.
300
Valenza, Gaetano, Luca Citi, Antonia Lanatá, Enzo Pasquale Scilingo, and Riccardo Barbieri, “Revealing Real-Time Emotional Responses: A Personalized Assessment Based on Heartbeat Dynamics”, Scientific Reports 4, no. 4998 (May 21, 2014), http://www.nature.com/articles/srep04998.
301
The Xbox also includes a standard camera that analyzes physical movement, and facial-expression analysis is either already built in or coming soon. See Wortham, Jenna, “If Our Gadgets Could Measure Our Emotions”, New York Times, June 1, 2013, http://www.nytimes.com/2013/06/02/technology/if-our-gadgets-could-measure-our-emotions.html.
302
Kellner, Tomas, “Meet the Fearbit: New Sweat Sensors Will Sniff Out Fatigue, Stress, and Even Fear”, GE Reports, August 12, 2014, http://www.gereports.com/post/93990980310/meet-the-fearbit-new-sweat-sensors-will-sniff-out.
303
Turner, Matthew A., Stephan Bandelow, L. Edwards, P. Patel, Helen J. Martin, Ian D. Wilson, and Charles L. Paul Thomas, “The Effect of Paced Auditory Serial Addition Test (PASAT) Intervention on the Profile of Volatile Organic Compounds in Human Breath: A Pilot Study”, Journal of Breath Research 7, no. 1 (February 27, 2013), http://iopscience.iop.org/article/10.1088/1752-7155/7/1/017102/meta.
304
Hernandez, Javier, Xavier Benavides, Pattie Maes, Daniel McDuff, Judith Amores, and Rosalind W. Picard, “AutoEmotive: Bringing Empathy to the Driving Experience to Manage Stress”, paper presented at the ACM Conference on Designing Interactive Systems, June 21–25, 2014, Vancouver, BC, Canada, http://affect.media.mit.edu/pdfs/14.Hernandez_et_al-DIS.pdf; Hernandez, Javier, Judith Amores, Daniel McDuff, and Xavier Benavides, “AutoEmotive”, MIT Affective Media Lab presentation at the VW Data Driven Hackathon, January 21, 2014, http://autoemotive.media.mit.edu/#about.
305
For an overview of the debate over the psychology of emotion, see Beck, Julie, “Hard Feelings: Science’s Struggle to Define Emotions”, Atlantic, February 24, 2015, http://www.theatlantic.com/features/archive/2015/02/hard-feelings-sciences-struggle-to-define-emotions/385711.
306
Barrett, Lisa Feldman, Batja Mesquita, Kevin N. Ochsner, and James J. Gross, “The Experience of Emotion”, Annual Review of Psychology 58 (January 2007), pp. 373–403, http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1934613.
307
Ekman, Emotions Revealed, p. 57.
308
Charlton, Alistair, “Future Supermarket Will Track Shoppers’ Eye Movements”, International Business Times, May 1, 2013, http://www.ibtimes.co.uk/future-supermarket-adverts-track-eye-gaze-sideways-463288.
309
Kahneman, Daniel, and Jackson Beatty, “Pupil Diameter and Load on Memory”, Science 154, no. 3756 (December 23, 1966), pp. 1583–1585.
310
Engbert, Ralf, and Reinhold Kliegl, “Microsaccades Uncover the Orientation of Covert Attention”, Vision Research 43, no. 9 (April 2003), pp. 1035–1045, http://www.sciencedirect.com/science/article/pii/S0042698903000841.
311
Marks, Paul, “Fitbit for the Mind: Eye-Tracker Watches Your Reading”, New Scientist, February 12, 2014, https://www.newscientist.com/article/mg22129563-700-fitbit-for-the-mind-eye-tracker-watches-your-reading.
312
Bulling, Andreas, Daniel Roggen, and Gerhard Tröster, “What’s in the Eyes for Context-Awareness?”, IEEE Pervasive Computing 10, no. 2 (April–June 2011), pp. 48–57, https://perceptual.mpi-inf.mpg.de/files/2013/03/bulling11_pcm.pdf.
313
Author’s conversation with Wilkey Wong, director of knowledge services at Tobii Technology, December 16, 2015.
314
Najar, Amir Shareghi, Antonija Mitrovic, and Kourosh Neshatian, “Eye Tracking and Studying Examples: How Novices and Advanced Learners Study SQL Examples”, Journal of Computing and Information Technology 23, no. 2 (2015),