<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>https://airwiki.deib.polimi.it/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=JulianMauricioAngelFernandez</id>
		<title>AIRWiki - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="https://airwiki.deib.polimi.it/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=JulianMauricioAngelFernandez"/>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php/Special:Contributions/JulianMauricioAngelFernandez"/>
		<updated>2026-04-05T01:42:00Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.25.6</generator>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17565</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17565"/>
				<updated>2015-03-14T15:33:53Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualties occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interaction is based on responding correctly to social situations: someone who does not respond in the expected way is marginalized by others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their tasks but also be accepted by humans, meaning that people feel comfortable interacting with them. As a consequence, social robots must be able to show emotions and to behave in a socially correct way. However, building robots that can both accomplish their tasks and show emotions is not easy: it requires selecting the correct emotion and expressing it in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find a real environment that allows research efforts to focus on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several studies have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because its constraints ensure that the actor knows what to say, how to react, and where objects and other actors are expected to be; all of this information is given beforehand in the script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting the training theories of theatre actors to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre and not just a prop, as is currently the case. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. In addition, the concept of emotional state is used to add emotional features to the performance of actions, obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and actors’ training, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how much effort actors make to repeat the same performance every time, subtle changes can be seen: actors' and objects' stage positions, actors' moods and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage actors, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the correct collaboration and coordination of playwright, director, technical staff and performers. The performers, in particular, must work as a unit and show the public a coherent story.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), which are provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any performance, rehearsals can be done to become familiar with the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines. Continuous lines represent information that is required by other modules. Dashed lines, on the other hand, mean that the information is not mandatory: the receiving module may or may not use it. If none of the modules accept suggestions from the other modules, the system can be seen as a traditional action-decision system without any kind of emotional component.&lt;br /&gt;
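&lt;br /&gt;
The distinction between mandatory (continuous) and optional (dashed) information can be sketched in a few lines of Python. This is only an illustrative sketch, not the actual TheatreBot code; the module name ActionModulation and its parameters are assumptions made for the example:&lt;br /&gt;

```python
# Illustrative sketch of the mandatory/optional information flow between
# modules. A continuous line corresponds to the required 'action' input;
# a dashed line corresponds to the optional 'emotion' suggestion.
class ActionModulation:
    def __init__(self, accept_suggestions=True):
        # If the module rejects suggestions, it degrades to a traditional
        # action-decision system without any emotional component.
        self.accept_suggestions = accept_suggestions

    def modulate(self, action, emotion=None):
        # 'action' is mandatory; 'emotion' may be absent or ignored.
        if emotion is not None and self.accept_suggestions:
            return "{0} performed with {1}".format(action, emotion)
        return "{0} performed neutrally".format(action)

plain = ActionModulation(accept_suggestions=False)
emotive = ActionModulation()
print(plain.modulate("walk", emotion="joy"))    # walk performed neutrally
print(emotive.modulate("walk", emotion="joy"))  # walk performed with joy
```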
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE, pp. 689–694. [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6681511 link]&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. TAROS 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=AXAglJKLwbI Final Video]&lt;br /&gt;
&lt;br /&gt;
==Robotic Platform==&lt;br /&gt;
A robotic platform has been developed that deliberately does not have a human-like appearance. The platform has undergone several changes during its development. The first version was built using an Arduino Mega, three metal gear motors, and one servo motor attached to a beam, as shown in the following image:&lt;br /&gt;
&lt;br /&gt;
[[File:Triskar.JPG|center|250px]]&lt;br /&gt;
&lt;br /&gt;
In the second version, two additional servos were added; the three servos were used to change the shape of the robot:&lt;br /&gt;
&lt;br /&gt;
[[File:FusionAdavance.png|center|450px]]&lt;br /&gt;
&lt;br /&gt;
An [http://www.hardkernel.com/main/products/prdt_info.php?g_code=g138745696275 Odroid U3] running [http://www.ros.org ROS] was then added. Communication between the Odroid and the Arduino is handled through rosserial.&lt;br /&gt;
&lt;br /&gt;
The upper part was later modified to improve emotion projection: the second level, which supported all the motors, was removed, and a motor was added to move the whole level forward and backward. Two plastic parts were also used to maintain the upper shape.&lt;br /&gt;
&lt;br /&gt;
[[File:newPlatform.jpg|center|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Connecting with the platform ===&lt;br /&gt;
The platform now has a USB Wi-Fi adapter that can be configured to connect to the network. [TODO] explain how to set up the network by changing the network configuration file. From a computer you can have an Ethernet connection and Wi-Fi communication with the platform by following the steps on this [http://blog.scottlowe.org/2013/05/29/a-quick-introduction-to-linux-policy-routing/ webpage].&lt;br /&gt;
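&lt;br /&gt;
As a sketch of the kind of configuration involved, a Debian-style /etc/network/interfaces stanza for a USB Wi-Fi adapter could look like the following. The interface name, SSID and passphrase are placeholders, not the platform's actual settings:&lt;br /&gt;

```shell
# Hypothetical /etc/network/interfaces stanza for the USB Wi-Fi adapter.
# Interface name, SSID and passphrase are placeholders.
auto wlan0
iface wlan0 inet dhcp
    wpa-ssid "LabNetwork"
    wpa-psk "ExamplePassphrase"
```

After editing the file, the interface can be brought up with ifup wlan0.&lt;br /&gt;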
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals, Blocking, Characterization, Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” in RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17564</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17564"/>
				<updated>2015-03-14T15:32:37Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualties occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interaction is based on responding correctly to social situations: someone who does not respond in the expected way is marginalized by others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their tasks but also be accepted by humans, meaning that people feel comfortable interacting with them. As a consequence, social robots must be able to show emotions and to behave in a socially correct way. However, building robots that can both accomplish their tasks and show emotions is not easy: it requires selecting the correct emotion and expressing it in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find a real environment that allows research efforts to focus on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several studies have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because its constraints ensure that the actor knows what to say, how to react, and where objects and other actors are expected to be; all of this information is given beforehand in the script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting the training theories of theatre actors to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre and not just a prop, as is currently the case. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. In addition, the concept of emotional state is used to add emotional features to the performance of actions, obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and actors’ training, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how much effort actors make to repeat the same performance every time, subtle changes can be seen: actors' and objects' stage positions, actors' moods and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage actors, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the correct collaboration and coordination of playwright, director, technical staff and performers. The performers, in particular, must work as a unit and show the public a coherent story.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), which are provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any performance, rehearsals can be done to become familiar with the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines. Continuous lines represent information that is required by other modules. Dashed lines, on the other hand, mean that the information is not mandatory: the receiving module may or may not use it. If none of the modules accept suggestions from the other modules, the system can be seen as a traditional action-decision system without any kind of emotional component.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE, pp. 689–694. [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6681511 link]&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. TAROS 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=AXAglJKLwbI Final Video]&lt;br /&gt;
&lt;br /&gt;
==Robotic Platform==&lt;br /&gt;
A robotic platform has been developed that deliberately does not have a human-like appearance. The platform has undergone several changes during its development. The first version was built using an Arduino Mega, three metal gear motors, and one servo motor attached to a beam, as shown in the following image:&lt;br /&gt;
&lt;br /&gt;
[[File:Triskar.JPG|center|250px]]&lt;br /&gt;
&lt;br /&gt;
In the second version, two additional servos were added; the three servos were used to change the shape of the robot:&lt;br /&gt;
&lt;br /&gt;
[[File:FusionAdavance.png|center|450px]]&lt;br /&gt;
&lt;br /&gt;
An [http://www.hardkernel.com/main/products/prdt_info.php?g_code=g138745696275 Odroid U3] running [http://www.ros.org ROS] was then added. Communication between the Odroid and the Arduino is handled through rosserial.&lt;br /&gt;
&lt;br /&gt;
The upper part was later modified to improve emotion projection: the second level, which supported all the motors, was removed, and a motor was added to move the whole level forward and backward. Two plastic parts were also used to maintain the upper shape.&lt;br /&gt;
&lt;br /&gt;
[[File:newPlatform.jpg|center|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Connecting with the platform ===&lt;br /&gt;
The platform now has a USB Wi-Fi adapter that can be configured to connect to the network. [TODO] explain how to set up the network by changing the network configuration file. From a computer you can have an Ethernet connection and Wi-Fi communication with the platform by following the steps on this [http://blog.scottlowe.org/2013/05/29/a-quick-introduction-to-linux-policy-routing/ webpage].&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals, Blocking, Characterization, Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” in RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17563</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17563"/>
				<updated>2015-03-14T15:31:27Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* Connecting with the platform */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualties occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interaction is based on responding correctly to social situations: someone who does not respond in the expected way is marginalized by others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their tasks but also be accepted by humans, meaning that people feel comfortable interacting with them. As a consequence, social robots must be able to show emotions and to behave in a socially correct way. However, building robots that can both accomplish their tasks and show emotions is not easy: it requires selecting the correct emotion and expressing it in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find a real environment that allows research efforts to focus on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several studies have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because its constraints ensure that the actor knows what to say, how to react, and where objects and other actors are expected to be; all of this information is given beforehand in the script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting the training theories of theatre actors to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre and not just a prop, as is currently the case. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. In addition, the concept of emotional state is used to add emotional features to the performance of actions, obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and actors’ training, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how much effort actors make to repeat the same performance every time, subtle changes can be seen: actors' and objects' stage positions, actors' moods and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage actors, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the correct collaboration and coordination of playwright, director, technical staff and performers. The performers, in particular, must work as a unit and show the public a coherent story.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), since the corresponding information is provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes. &lt;br /&gt;
*Since the script is known before any representation, rehearsals can be done to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
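The eight preset orientations can be pictured as compass-like headings at multiples of 45°. The sketch below only illustrates this discretization; the names, angle convention, and helper functions are assumptions for illustration, not part of the project's code:&lt;br /&gt;

```python
import math

# Hypothetical discretization of the eight preset stage orientations,
# modelled here as compass points at multiples of 45 degrees.
ORIENTATIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def orientation_angle(name):
    """Heading in radians for one of the eight preset orientations."""
    return ORIENTATIONS.index(name) * math.pi / 4.0

def nearest_orientation(angle):
    """Snap an arbitrary heading (in radians) to the nearest preset orientation."""
    index = round((angle % (2 * math.pi)) / (math.pi / 4.0)) % 8
    return ORIENTATIONS[index]
```

Such a snapping step would let a planner work with eight symbolic orientations while the controller still handles continuous headings.&lt;br /&gt;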
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines. Continuous lines represent information that is required by other modules. Dashed lines mean that the information is not mandatory, so the receiving module may or may not use it. If none of the modules accept suggestions from other modules, the system can be seen as a traditional action decision system without any emotional addition.&lt;br /&gt;
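A minimal sketch of this distinction, assuming hypothetical module and field names (this is not the project's actual code): the mandatory belief input always drives the decision, while an emotional suggestion arriving on a dashed line may simply be ignored:&lt;br /&gt;

```python
from typing import Optional

class ActionDecision:
    """Toy action decision module illustrating mandatory vs. optional inputs."""

    def decide(self, belief: dict, emotion_suggestion: Optional[str] = None) -> str:
        # Continuous line: the belief state is a mandatory input.
        action = "say_line" if belief.get("cue_heard") else "wait"
        # Dashed line: the emotional suggestion is optional; dropping this
        # branch degrades the module to a traditional action decision system.
        if emotion_suggestion is not None:
            action = "{}[{}]".format(action, emotion_suggestion)
        return action
```

For instance, calling `decide` with a belief containing a heard cue and the suggestion `"joy"` yields an emotionally modulated action, while the same call without a suggestion yields the plain action.&lt;br /&gt;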
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE Computer Society, pp. 689–694. [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6681511 link]&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=AXAglJKLwbI Final Video]&lt;br /&gt;
&lt;br /&gt;
==Robotic Platform==&lt;br /&gt;
A robotic platform has been developed that deliberately does not have a human-like appearance. The platform has undergone several changes during its development. The first version was built using an Arduino Mega, three metal-gear motors, and one servo motor attached to a beam, as shown in the following image:&lt;br /&gt;
&lt;br /&gt;
[[File:Triskar.JPG|center|250px]]&lt;br /&gt;
&lt;br /&gt;
In the second version, two additional servos were added. The three servos were used to change the shape of the robot:&lt;br /&gt;
&lt;br /&gt;
[[File:FusionAdavance.png|center|450px]]&lt;br /&gt;
&lt;br /&gt;
An [http://www.hardkernel.com/main/products/prdt_info.php?g_code=g138745696275 Odroid U3] was added, on which [http://www.ros.org ROS] was installed. The communication between the Odroid and the Arduino is done through ROS Serial.&lt;br /&gt;
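The rosserial link mentioned above frames each message with sync bytes, a length field, a topic id and checksums. The sketch below illustrates that framing in plain Python; the field layout loosely follows the published rosserial wire protocol and is an illustration, not code from this project:&lt;br /&gt;

```python
def frame_message(topic_id, payload):
    """Frame a payload roughly the way rosserial does (simplified illustration)."""
    sync = bytes([0xFF, 0xFE])                      # sync flag + protocol version
    length_bytes = len(payload).to_bytes(2, "little")
    length_checksum = 255 - sum(length_bytes) % 256  # checksum over the length field
    topic_bytes = topic_id.to_bytes(2, "little")
    data_checksum = 255 - (sum(topic_bytes) + sum(payload)) % 256
    return (sync + length_bytes + bytes([length_checksum])
            + topic_bytes + payload + bytes([data_checksum]))
```

On the Arduino side the same framing is validated byte by byte, which is what lets a low-bandwidth serial link carry ROS topics reliably.&lt;br /&gt;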
&lt;br /&gt;
Further modifications of the upper part were made to improve emotion projection: the second level, which supported all the motors, was removed, and a motor was added to move the whole level forward and backward. Two plastic parts were also used to maintain the upper shape.&lt;br /&gt;
&lt;br /&gt;
[[File:newPlatform.jpg|center|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Connecting with the platform ===&lt;br /&gt;
The platform now has a USB WiFi adapter that can be configured to connect to the network. [TODO] explain how to set up the network by changing the network file. From a computer you can have an Ethernet connection and WiFi communication with the platform by following the steps on this [http://blog.scottlowe.org/2013/05/29/a-quick-introduction-to-linux-policy-routing/ webpage]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics, 2009. ICAR 2009., 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17561</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17561"/>
				<updated>2015-03-14T15:30:01Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=Aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualities occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to social situations. If someone does not respond in an expected way, he/she is margined by the others. Thus, robots that interact with humans in everyday life places, such as home, office, classroom and public spaces, should not only accomplish their task, but also be accepted by humans, which means that they feel comfortable to interact with robots. As a consequence, social robots must have the capacity to show emotions and behave in a socially correct way. However, building robots that could accomplish their tasks and show emotion is not an easy job due to the difficulty to select the correct emotion, show the emotion in a way that could be understandable by humans, together with all the traditional problems to perform a given task. This makes crucial to find a real environment that allows focusing the research efforts on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.). &lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because its constraints let the actor know what to say, how to react, and where the objects and other actors are expected to be. All of this information is given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting theatre actors’ training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. In addition, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction. &lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, constraints and actors’ lessons, theatre is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make it a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how hard actors try to repeat the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' moods and, more importantly, the audience's attitude. &lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The performance outcome does not rely on a single person. A good outcome comes from the correct collaboration and coordination of playwright, director, technical staff and performers. In particular, the performers must work as a unit and show the public a coherent story. &lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), since the corresponding information is provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes. &lt;br /&gt;
*Since the script is known before any representation, rehearsals can be done to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines. Continuous lines represent information that is required by other modules. Dashed lines mean that the information is not mandatory, so the receiving module may or may not use it. If none of the modules accept suggestions from other modules, the system can be seen as a traditional action decision system without any emotional addition.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE Computer Society, pp. 689–694. [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6681511 link]&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=AXAglJKLwbI Final Video]&lt;br /&gt;
&lt;br /&gt;
==Robotic Platform==&lt;br /&gt;
A robotic platform has been developed that deliberately does not have a human-like appearance. The platform has undergone several changes during its development. The first version was built using an Arduino Mega, three metal-gear motors, and one servo motor attached to a beam, as shown in the following image:&lt;br /&gt;
&lt;br /&gt;
[[File:Triskar.JPG|center|250px]]&lt;br /&gt;
&lt;br /&gt;
In the second version, two additional servos were added. The three servos were used to change the shape of the robot:&lt;br /&gt;
&lt;br /&gt;
[[File:FusionAdavance.png|center|450px]]&lt;br /&gt;
&lt;br /&gt;
An [http://www.hardkernel.com/main/products/prdt_info.php?g_code=g138745696275 Odroid U3] was added, on which [http://www.ros.org ROS] was installed. The communication between the Odroid and the Arduino is done through ROS Serial.&lt;br /&gt;
&lt;br /&gt;
Further modifications of the upper part were made to improve emotion projection: the second level, which supported all the motors, was removed, and a motor was added to move the whole level forward and backward. Two plastic parts were also used to maintain the upper shape.&lt;br /&gt;
&lt;br /&gt;
[[File:newPlatform.jpg|center|300px]]&lt;br /&gt;
&lt;br /&gt;
=== Connecting with the platform ===&lt;br /&gt;
The platform now has a USB WiFi adapter that can be configured to connect to the network. [TODO] explain how to set up the network by changing the network file. From a computer you can have an Ethernet connection and WiFi communication with the platform by following the steps on this [http://blog.scottlowe.org/2013/05/29/a-quick-introduction-to-linux-policy-routing/ webpage]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics, 2009. ICAR 2009., 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in osaka university’s ’robot theater project’,” Japandaily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17184</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17184"/>
				<updated>2014-11-18T17:45:49Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* Robotic Platform */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=Aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualities occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interaction is based on responding appropriately to social situations: someone who does not respond in the expected way is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms, and public spaces, should not only accomplish their tasks but also be accepted by humans, who should feel comfortable interacting with them. As a consequence, social robots must have the capacity to show emotions and behave in a socially appropriate way. However, building robots that can both accomplish their tasks and show emotion is not easy: on top of all the traditional problems of performing a given task, the robot must select the correct emotion and show it in a way that humans can understand. This makes it crucial to find a real environment that allows research efforts to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because theatre constraints let the actor know what to say, how to react, and where the objects and the other actors are expected to be. All of this information is given beforehand in the script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting theatre actors’ training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. In addition, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints, and actors’ training, theatre is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience gets a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how much effort actors put into repeating the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' moods and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs, and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the collaboration and coordination of the playwright, the director, the technical staff, and the performers. The performers, in particular, must work as a unit and show the public a coherent story.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework in which to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), since the missing information is provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any representation, rehearsals can be done to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, which makes it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is required by other modules, while dashed lines represent information that is not mandatory, so the receiving module may or may not use it. If none of the modules accepts suggestions from the other modules, the system behaves as a traditional action decision system without any kind of emotional addition.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE COMPUTER SOC, p. 689-694 p. [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6681511 link]&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=AXAglJKLwbI Final Video]&lt;br /&gt;
&lt;br /&gt;
==Robotic Platform==&lt;br /&gt;
A robotic platform that deliberately does not have a human-like appearance has been developed. The platform has gone through several revisions during its development. The first version was built using an Arduino Mega, three metal gearmotors, and one servo motor attached to a beam, as shown in the following image:&lt;br /&gt;
&lt;br /&gt;
[[File:Triskar.JPG|center|250px]]&lt;br /&gt;
&lt;br /&gt;
In the second version, two additional servos were added; the three servos are used to change the shape of the robot:&lt;br /&gt;
&lt;br /&gt;
[[File:FusionAdavance.png|center|450px]]&lt;br /&gt;
&lt;br /&gt;
An [http://www.hardkernel.com/main/products/prdt_info.php?g_code=g138745696275 Odroid U3] running [http://www.ros.org ROS] was also added. Communication between the Odroid and the Arduino is handled through rosserial.&lt;br /&gt;
&lt;br /&gt;
The upper part was later modified to improve emotion projection: the second level, which supported all the motors, was removed and replaced by a single motor that moves the whole level forward and backward. Two plastic parts were added to maintain the upper shape.&lt;br /&gt;
&lt;br /&gt;
[[File:newPlatform.jpg|center|300px]]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics, 2009. ICAR 2009., 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in osaka university’s ’robot theater project’,” Japandaily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17183</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17183"/>
				<updated>2014-11-18T16:41:58Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=Aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualities occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interaction is based on responding appropriately to social situations: someone who does not respond in the expected way is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms, and public spaces, should not only accomplish their tasks but also be accepted by humans, who should feel comfortable interacting with them. As a consequence, social robots must have the capacity to show emotions and behave in a socially appropriate way. However, building robots that can both accomplish their tasks and show emotion is not easy: on top of all the traditional problems of performing a given task, the robot must select the correct emotion and show it in a way that humans can understand. This makes it crucial to find a real environment that allows research efforts to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because theatre constraints let the actor know what to say, how to react, and where the objects and the other actors are expected to be. All of this information is given beforehand in the script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting theatre actors’ training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. In addition, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints, and actors’ training, theatre is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience gets a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how much effort actors put into repeating the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' moods and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs, and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the collaboration and coordination of the playwright, the director, the technical staff, and the performers. The performers, in particular, must work as a unit and show the public a coherent story.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework in which to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), since the missing information is provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any representation, rehearsals can be done to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, which makes it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is required by other modules, while dashed lines represent information that is not mandatory, so the receiving module may or may not use it. If none of the modules accepts suggestions from the other modules, the system behaves as a traditional action decision system without any kind of emotional addition.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE COMPUTER SOC, p. 689-694 p. [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6681511 link]&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=AXAglJKLwbI Final Video]&lt;br /&gt;
&lt;br /&gt;
==Robotic Platform==&lt;br /&gt;
A robotic platform that deliberately does not have a human-like appearance has been developed. The platform has gone through several revisions during its development. The first version was built using an Arduino Mega, three metal gearmotors, and one servo motor attached to a beam, as shown in the following image:&lt;br /&gt;
&lt;br /&gt;
[[File:Triskar.JPG|center|250px]]&lt;br /&gt;
&lt;br /&gt;
In the second version, two additional servos were added; the three servos are used to change the shape of the robot:&lt;br /&gt;
&lt;br /&gt;
[[File:FusionAdavance.png|center|450px]]&lt;br /&gt;
&lt;br /&gt;
An [http://www.hardkernel.com/main/products/prdt_info.php?g_code=g138745696275 Odroid U3] running [http://www.ros.org ROS] was also added. Communication between the Odroid and the Arduino is handled through rosserial.&lt;br /&gt;
&lt;br /&gt;
The upper part was later modified to improve emotion projection: the second level, which supported all the motors, was removed and replaced by a single motor that moves the whole level forward and backward. Two plastic parts were added to maintain the upper shape.&lt;br /&gt;
&lt;br /&gt;
[[File:newPlatform.jpg|center|300px]]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics, 2009. ICAR 2009., 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in osaka university’s ’robot theater project’,” Japandaily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=File:NewPlatform.jpg&amp;diff=17182</id>
		<title>File:NewPlatform.jpg</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=File:NewPlatform.jpg&amp;diff=17182"/>
				<updated>2014-11-18T16:38:26Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: New robotic platform&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;New robotic platform&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=File:FusionAdavance.png&amp;diff=17181</id>
		<title>File:FusionAdavance.png</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=File:FusionAdavance.png&amp;diff=17181"/>
				<updated>2014-11-18T15:48:45Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: Second platform&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Second platform&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=File:Triskar.JPG&amp;diff=17180</id>
		<title>File:Triskar.JPG</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=File:Triskar.JPG&amp;diff=17180"/>
				<updated>2014-11-18T15:36:38Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: First version of the robot&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;First version of the robot&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=17172</id>
		<title>ODROID</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=17172"/>
				<updated>2014-11-10T11:30:01Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;There are some [http://www.hardkernel.com/ ODROID] U2 and two U3 for development of small robots. Here are details and suggestions for their use.&lt;br /&gt;
&lt;br /&gt;
=== Useful How-Tos ===&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=81 Kernel recompiling process for the ODROID-U2]; it is necessary to do this if you ever wish to install external/out-of-tree drivers, as the &amp;quot;official package&amp;quot; is missing several critical kernel headers.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested OS and versions ===&lt;br /&gt;
* [http://odroid.in/ubuntu-server-13.05/ Lubuntu 13.05 Server] does not have a graphical interface and comes with ssh preinstalled. To install ROS, please follow these [http://wiki.ros.org/hydro/Installation/UbuntuARM instructions].&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=1193 Xubuntu 13.04] for armhf architectures, released by HardKernel with their [https://github.com/hardkernel/linux modified Linux kernel, version 3.0.75]. The kernel can be recompiled to enable 3D video acceleration, but this prevents USB video cameras from working, so it is useless for robot development (even more so since OpenCL cannot be used).&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=12 Linaro Ubuntu 12.11] with HardKernel's Linux kernel, version 3.0.60. Grab [http://dn.odroid.com/Ubuntu_U2/20130125/ this image] dated 25-01-2013, flash it on your microSD card and then apply the latest kernel point update (at the time of writing, version 3.0.63, dated 13-02-2013) following the instructions given in the thread's opening post.&lt;br /&gt;
&lt;br /&gt;
'''All OSes have wireless networking issues, please read below for more information.'''&lt;br /&gt;
&lt;br /&gt;
=== Tested I/O devices ===&lt;br /&gt;
&lt;br /&gt;
====TP-LINK WN725n v2====&lt;br /&gt;
This device is not supported by the stock Linux kernel, so its driver must be compiled for the target platform. To do this, you can follow this [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=1674 tutorial].&lt;br /&gt;
&lt;br /&gt;
Sometimes the tutorial alone is not enough, because the system does not add the wlanX interface to the configuration file. In that case, edit the following file:&lt;br /&gt;
 sudo vim /etc/network/interfaces&lt;br /&gt;
In the file add the following lines:&lt;br /&gt;
 auto wlanX&lt;br /&gt;
 iface wlanX inet dhcp&lt;br /&gt;
where X is the interface number, which can be found with:&lt;br /&gt;
 iwconfig&lt;br /&gt;
After saving the file, restart networking:&lt;br /&gt;
 sudo /etc/init.d/networking restart&lt;br /&gt;
If this command does not work, reboot the system:&lt;br /&gt;
 sudo reboot now&lt;br /&gt;
To list the available wifi networks:&lt;br /&gt;
 sudo iwlist wlanX scanning&lt;br /&gt;
If this does not work, try bringing the interface up first:&lt;br /&gt;
  sudo ifconfig wlanX up&lt;br /&gt;
then redo the steps above.&lt;br /&gt;
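Put together, the resulting /etc/network/interfaces fragment looks like the following (a sketch; wlan0 is assumed here, substitute the interface number reported by iwconfig):

```shell
# /etc/network/interfaces -- bring the wifi adapter up automatically with DHCP
auto wlan0
iface wlan0 inet dhcp
```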
&lt;br /&gt;
==== Monitors ====&lt;br /&gt;
Monitors need a native HDMI interface to work with ODROIDs due to the strict requirements of the Exynos system-on-chip; for the same reason, external HDMI-to-DVI adapters are also '''not''' recommended, as they may result in a blank screen ([http://odroid.us/mediawiki/index.php?title=Troubleshooting#My_Device_boots_.28The_alive_led_blinks.29.2C_but_it_doesnt_show_anything_on_the_display source]).&lt;br /&gt;
&lt;br /&gt;
This is a list of monitors and TV screens with native HDMI ports that are known to work.&lt;br /&gt;
* Sony KDL-32V4500&lt;br /&gt;
* Samsung Syncmaster XL2370 HD&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wireless communication ===&lt;br /&gt;
See problems below.&lt;br /&gt;
&lt;br /&gt;
=== Tested wired communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Unresolved issues ===&lt;br /&gt;
&lt;br /&gt;
==== Wireless Networking (all tested OSes) ====&lt;br /&gt;
Wireless communications seem to be very troublesome, as the network interfaces tend to lose or drop a lot of packets, both in RX and TX, to the point that a reliable SSH connection cannot be established. This happens at least with the &amp;quot;officially supported&amp;quot; wireless NIC (with a Realtek RTL8191SU chipset), even when the wifi signal strength is excellent.&lt;br /&gt;
&lt;br /&gt;
A curious aspect of this issue is that incoming packets are dropped heavily, while outgoing packets are only modestly affected.&lt;br /&gt;
&lt;br /&gt;
That said, the RTL8191SU adapter also loses some packets when connected to other computers, so maybe it's really not the best USB wifi adapter ''ever''.&lt;br /&gt;
&lt;br /&gt;
  root@odroid:~# ifconfig wlan6&lt;br /&gt;
  wlan6     Link encap:Ethernet  HWaddr **:**:**:**:**:**&lt;br /&gt;
           inet addr:192.168.1.60  Bcast:192.168.1.255  Mask:255.255.255.0&lt;br /&gt;
           inet6 addr: fe80::ee1a:59ff:fe0e:f122/64 Scope:Link&lt;br /&gt;
           UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1&lt;br /&gt;
           RX packets:453 errors:0 dropped:600 overruns:0 frame:0&lt;br /&gt;
           TX packets:292 errors:0 dropped:7 overruns:0 carrier:0&lt;br /&gt;
           collisions:0 txqueuelen:1000&lt;br /&gt;
           RX bytes:79264 (79.2 KB)  TX bytes:35116 (35.1 KB)&lt;br /&gt;
&lt;br /&gt;
  odroid@odroid:~$ iwconfig wlan6&lt;br /&gt;
  wlan6     IEEE 802.11bg  ESSID:&amp;quot;MyWirelessNet&amp;quot;  Nickname:&amp;quot;&amp;lt;WIFI@REALTEK&amp;gt;&amp;quot;&lt;br /&gt;
          Mode:Managed  Frequency:2.437 GHz  Access Point: **:**:**:**:**:**&lt;br /&gt;
          Bit Rate:54 Mb/s   Sensitivity:0/0  &lt;br /&gt;
          Retry:off   RTS thr:off   Fragment thr:off&lt;br /&gt;
          Encryption key:****-****-****-****-****-****-****-****   Security mode:open&lt;br /&gt;
          Power Management:off&lt;br /&gt;
          Link Quality=100/100  Signal level=100/100  Noise level=0/100&lt;br /&gt;
          Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0&lt;br /&gt;
          Tx excessive retries:0  Invalid misc:0   Missed beacon:0&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== Realtek RTL8192CU =====&lt;br /&gt;
This chipset seems not to be even detected by the stock HK kernel (all versions of it), but it is possible to use it after [http://forum.odroid.com/viewtopic.php?f=29&amp;amp;t=1516 recompiling the kernel], disabling the in-kernel module for the NIC, and integrating [http://www.realtek.com/downloads/downloadsView.aspx?Langid=1&amp;amp;PNid=21&amp;amp;PFid=48&amp;amp;Level=5&amp;amp;Conn=4&amp;amp;DownTypeID=3&amp;amp;GetDown=false&amp;amp;Downloads=true Realtek's official driver] for this chipset inside it (instructions are included in the package).&lt;br /&gt;
&lt;br /&gt;
However, you still get a huge packet loss rate, even if you disable the power saving feature of the driver by loading the kernel module with&lt;br /&gt;
  sudo modprobe 8192cu rtw_power_mgnt=0&lt;br /&gt;
&lt;br /&gt;
=== Solved issues ===&lt;br /&gt;
&lt;br /&gt;
==== lshw glitches (all tested OSes) ====&lt;br /&gt;
With the stock kernel the lshw command seems to be partly broken, but this is resolved by calling it with&lt;br /&gt;
  sudo lshw -disable dmi&lt;br /&gt;
although this will prevent the detection of certain features of the board. Another way of fixing this issue is to recompile the kernel by hand (see the how-to linked above).&lt;br /&gt;
&lt;br /&gt;
==Configuring a Network==&lt;br /&gt;
These steps are useful when you are working via ssh or have just finished installing Ubuntu on the SD card. &lt;br /&gt;
&lt;br /&gt;
Open the file /etc/network/interfaces and add the following lines:&lt;br /&gt;
&lt;br /&gt;
 iface wlanX inet static&lt;br /&gt;
 address &amp;quot;ip address&amp;quot;&lt;br /&gt;
 wpa-ssid &amp;quot;name of the network&amp;quot;&lt;br /&gt;
 wpa-psk &amp;quot;password of the network&amp;quot;&lt;br /&gt;
&lt;br /&gt;
Replace X with the correct interface number.&lt;br /&gt;
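A static stanza usually also needs a netmask (and a gateway if the board must reach other networks); a fuller hedged example, with placeholder values that you must replace with your own network's:

```shell
# /etc/network/interfaces -- static address on a WPA-protected network
auto wlan0
iface wlan0 inet static
    address 192.168.1.60          # placeholder: pick a free address on your LAN
    netmask 255.255.255.0
    gateway 192.168.1.1           # placeholder: your router's address
    wpa-ssid "MyWirelessNet"      # placeholder: network name
    wpa-psk "secret-passphrase"   # placeholder: network password
```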
&lt;br /&gt;
==Communication with Arduino ==&lt;br /&gt;
When the ODROID starts up, the serial port to which the Arduino is plugged does not have the permissions needed for it to be accessed from the ODROID's OS. Rather than logging in and setting the permissions on the port by hand every time, you can add a script that is executed each time the ODROID boots. To do this, create a file xx.sh, where xx is the name of the file, containing:&lt;br /&gt;
&lt;br /&gt;
 echo &amp;quot;odroid&amp;quot; | sudo -S chmod 777 /dev/ttyACMX&lt;br /&gt;
&lt;br /&gt;
where X is the number of the port to which the Arduino is connected.&lt;br /&gt;
&lt;br /&gt;
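The whole boot script can be sketched as follows (a sketch: the device path and the permissive 777 mode follow the text above; since scripts in /etc/init.d/ already run as root, no sudo/password pipe is actually needed):

```shell
#!/bin/sh
# xx.sh -- run at boot from /etc/init.d/ so that non-root processes
# can access the Arduino's serial port.
fix_perms() {
    dev="$1"
    if [ -e "$dev" ]; then
        # open the port to every user, as described in the text above
        chmod 777 "$dev"
    fi
}
fix_perms /dev/ttyACM0   # adjust the number to the port the Arduino uses
```

A udev rule, or adding your user to the dialout group, would be the more standard fix, but the init script mirrors what this page describes.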
Move xx.sh to /etc/init.d/ and then run the following command:&lt;br /&gt;
&lt;br /&gt;
 update-rc.d xx.sh defaults&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=17171</id>
		<title>ODROID</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=17171"/>
				<updated>2014-11-10T11:28:09Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;There are some [http://www.hardkernel.com/ ODROID] U2 boards and two U3 boards available for the development of small robots. Here are details and suggestions for their use.&lt;br /&gt;
&lt;br /&gt;
=== Useful How-Tos ===&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=81 Kernel recompiling process for the ODROID-U2]; it is necessary to do this if you ever wish to install external/out-of-tree drivers, as the &amp;quot;official package&amp;quot; is missing several critical kernel headers.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested OS and versions ===&lt;br /&gt;
* [http://odroid.in/ubuntu-server-13.05/ Lubuntu 13.05 Server] does not have a graphical interface and comes with ssh preinstalled. To install ROS, please follow these [http://wiki.ros.org/hydro/Installation/UbuntuARM instructions].&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=1193 Xubuntu 13.04] for armhf architectures, released by HardKernel with their [https://github.com/hardkernel/linux modified Linux kernel, version 3.0.75]. The kernel can be recompiled to enable 3D video acceleration, but this prevents USB video cameras from working, so it is useless for robot development (even more so since OpenCL cannot be used).&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=12 Linaro Ubuntu 12.11] with HardKernel's Linux kernel, version 3.0.60. Grab [http://dn.odroid.com/Ubuntu_U2/20130125/ this image] dated 25-01-2013, flash it on your microSD card and then apply the latest kernel point update (at the time of writing, version 3.0.63, dated 13-02-2013) following the instructions given in the thread's opening post.&lt;br /&gt;
&lt;br /&gt;
'''All OSes have wireless networking issues, please read below for more information.'''&lt;br /&gt;
&lt;br /&gt;
=== Tested I/O devices ===&lt;br /&gt;
&lt;br /&gt;
====TP-LINK WN725n v2====&lt;br /&gt;
This device is not supported by the stock Linux kernel, so its driver must be compiled for the target platform. To do this, you can follow this [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=1674 tutorial].&lt;br /&gt;
&lt;br /&gt;
Sometimes the tutorial alone is not enough, because the system does not add the wlanX interface to the configuration file. In that case, edit the following file:&lt;br /&gt;
 sudo vim /etc/network/interfaces&lt;br /&gt;
In the file add the following lines:&lt;br /&gt;
 auto wlanX&lt;br /&gt;
 iface wlanX inet dhcp&lt;br /&gt;
where X is the interface number, which can be found with:&lt;br /&gt;
 iwconfig&lt;br /&gt;
After saving the file, restart networking:&lt;br /&gt;
 sudo /etc/init.d/networking restart&lt;br /&gt;
If this command does not work, reboot the system:&lt;br /&gt;
 sudo reboot now&lt;br /&gt;
To list the available wifi networks:&lt;br /&gt;
 sudo iwlist wlanX scanning&lt;br /&gt;
If this does not work, try bringing the interface up first:&lt;br /&gt;
  sudo ifconfig wlanX up&lt;br /&gt;
then redo the steps above.&lt;br /&gt;
&lt;br /&gt;
==== Monitors ====&lt;br /&gt;
Monitors need a native HDMI interface to work with ODROIDs due to the strict requirements of the Exynos system-on-chip; for the same reason, external HDMI-to-DVI adapters are also '''not''' recommended, as they may result in a blank screen ([http://odroid.us/mediawiki/index.php?title=Troubleshooting#My_Device_boots_.28The_alive_led_blinks.29.2C_but_it_doesnt_show_anything_on_the_display source]).&lt;br /&gt;
&lt;br /&gt;
This is a list of monitors and TV screens with native HDMI ports that are known to work.&lt;br /&gt;
* Sony KDL-32V4500&lt;br /&gt;
* Samsung Syncmaster XL2370 HD&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wireless communication ===&lt;br /&gt;
See problems below.&lt;br /&gt;
&lt;br /&gt;
=== Tested wired communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Unresolved issues ===&lt;br /&gt;
&lt;br /&gt;
==== Wireless Networking (all tested OSes) ====&lt;br /&gt;
Wireless communications seem to be very troublesome, as the network interfaces tend to lose or drop a lot of packets, both in RX and TX, to the point that a reliable SSH connection cannot be established. This happens at least with the &amp;quot;officially supported&amp;quot; wireless NIC (with a Realtek RTL8191SU chipset), even when the wifi signal strength is excellent.&lt;br /&gt;
&lt;br /&gt;
A curious aspect of this issue is that incoming packets are dropped heavily, while outgoing packets are only modestly affected.&lt;br /&gt;
&lt;br /&gt;
That said, the RTL8191SU adapter also loses some packets when connected to other computers, so maybe it's really not the best USB wifi adapter ''ever''.&lt;br /&gt;
&lt;br /&gt;
  root@odroid:~# ifconfig wlan6&lt;br /&gt;
  wlan6     Link encap:Ethernet  HWaddr **:**:**:**:**:**&lt;br /&gt;
           inet addr:192.168.1.60  Bcast:192.168.1.255  Mask:255.255.255.0&lt;br /&gt;
           inet6 addr: fe80::ee1a:59ff:fe0e:f122/64 Scope:Link&lt;br /&gt;
           UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1&lt;br /&gt;
           RX packets:453 errors:0 dropped:600 overruns:0 frame:0&lt;br /&gt;
           TX packets:292 errors:0 dropped:7 overruns:0 carrier:0&lt;br /&gt;
           collisions:0 txqueuelen:1000&lt;br /&gt;
           RX bytes:79264 (79.2 KB)  TX bytes:35116 (35.1 KB)&lt;br /&gt;
&lt;br /&gt;
  odroid@odroid:~$ iwconfig wlan6&lt;br /&gt;
  wlan6     IEEE 802.11bg  ESSID:&amp;quot;MyWirelessNet&amp;quot;  Nickname:&amp;quot;&amp;lt;WIFI@REALTEK&amp;gt;&amp;quot;&lt;br /&gt;
          Mode:Managed  Frequency:2.437 GHz  Access Point: **:**:**:**:**:**&lt;br /&gt;
          Bit Rate:54 Mb/s   Sensitivity:0/0  &lt;br /&gt;
          Retry:off   RTS thr:off   Fragment thr:off&lt;br /&gt;
          Encryption key:****-****-****-****-****-****-****-****   Security mode:open&lt;br /&gt;
          Power Management:off&lt;br /&gt;
          Link Quality=100/100  Signal level=100/100  Noise level=0/100&lt;br /&gt;
          Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0&lt;br /&gt;
          Tx excessive retries:0  Invalid misc:0   Missed beacon:0&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== Realtek RTL8192CU =====&lt;br /&gt;
This chipset seems not to be even detected by the stock HK kernel (all versions of it), but it is possible to use it after [http://forum.odroid.com/viewtopic.php?f=29&amp;amp;t=1516 recompiling the kernel], disabling the in-kernel module for the NIC, and integrating [http://www.realtek.com/downloads/downloadsView.aspx?Langid=1&amp;amp;PNid=21&amp;amp;PFid=48&amp;amp;Level=5&amp;amp;Conn=4&amp;amp;DownTypeID=3&amp;amp;GetDown=false&amp;amp;Downloads=true Realtek's official driver] for this chipset inside it (instructions are included in the package).&lt;br /&gt;
&lt;br /&gt;
However, you still get a huge packet loss rate, even if you disable the power saving feature of the driver by loading the kernel module with&lt;br /&gt;
  sudo modprobe 8192cu rtw_power_mgnt=0&lt;br /&gt;
&lt;br /&gt;
=== Solved issues ===&lt;br /&gt;
&lt;br /&gt;
==== lshw glitches (all tested OSes) ====&lt;br /&gt;
With the stock kernel the lshw command seems to be partly broken, but this is resolved by calling it with&lt;br /&gt;
  sudo lshw -disable dmi&lt;br /&gt;
although this will prevent the detection of certain features of the board. Another way of fixing this issue is to recompile the kernel by hand (see the how-to linked above).&lt;br /&gt;
&lt;br /&gt;
==Configuring a Network==&lt;br /&gt;
These steps are useful when you are working via ssh or have just finished installing Ubuntu on the SD card. &lt;br /&gt;
&lt;br /&gt;
Open the file /etc/network/interfaces and add the following lines:&lt;br /&gt;
&lt;br /&gt;
iface wlanX inet static&lt;br /&gt;
address &amp;quot;ip address&amp;quot;&lt;br /&gt;
wpa-ssid &amp;quot;name of the network&amp;quot;&lt;br /&gt;
wpa-psk &amp;quot;password of the network&amp;quot;&lt;br /&gt;
&lt;br /&gt;
Replace X with the correct interface number.&lt;br /&gt;
&lt;br /&gt;
==Communication with Arduino ==&lt;br /&gt;
When the ODROID starts up, the serial port to which the Arduino is plugged does not have the permissions needed for it to be accessed from the ODROID's OS. Rather than logging in and setting the permissions on the port by hand every time, you can add a script that is executed each time the ODROID boots. To do this, create a file xx.sh, where xx is the name of the file, containing:&lt;br /&gt;
&lt;br /&gt;
echo &amp;quot;odroid&amp;quot; | sudo -S chmod 777 /dev/ttyACM0&lt;br /&gt;
&lt;br /&gt;
Move xx.sh to /etc/init.d/ and then run the following command:&lt;br /&gt;
&lt;br /&gt;
update-rc.d xx.sh defaults&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17170</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17170"/>
				<updated>2014-11-10T11:16:30Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=Aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualities occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to social situations. If someone does not respond in the expected way, he/she is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their task, but also be accepted by humans, meaning that people feel comfortable interacting with them. As a consequence, social robots must have the capacity to show emotions and behave in a socially correct way. However, building robots that can accomplish their tasks and show emotion is not easy, due to the difficulty of selecting the correct emotion and showing it in a way humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find a real environment that allows focusing the research efforts on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.). &lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because theatre constraints let the actor know what to say, how to react, and where objects and other actors are expected to be. All of this information is given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting theatre actors' training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre and not just a prop, as is currently the case. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational and social model of the world to represent its character's feelings and beliefs about the world. Besides, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction. &lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and actors' lessons, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how hard actors try to repeat the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' mood and, more importantly, the audience's attitude. &lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The performance outcome does not rely on one person. A good outcome comes from the collaboration and coordination of playwright, director, technical staff and performers. The performers in particular must work as a unit and show the public a coherent story. &lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on specific abilities and features to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), which are provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes. &lt;br /&gt;
*Since the script is known before any representation, rehearsals can be done to get used to objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is required by other modules, while dashed lines mean the information is not mandatory, so the receiving module may or may not use it. If no module accepts suggestions from the other modules, the system can be seen as a traditional action decision system without any kind of emotional component.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE COMPUTER SOC, p. 689-694 p. [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6681511 link]&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=AXAglJKLwbI Final Video]&lt;br /&gt;
&lt;br /&gt;
==Robotic Platform==&lt;br /&gt;
A robotic platform has been developed to test how to convey emotions without using human features. The platform has changed significantly from the partial version used at the Researchers' Night 2013 to the version shown at the Maker Faire 2014. The latest version of the platform uses an ODROID as processor and an Arduino Due as micro-controller.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing !!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, *Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/ talks/heather knight silicon based comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W. chen Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics, 2009. ICAR 2009., 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in osaka university’s ’robot theater project’,” Japandaily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17082</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17082"/>
				<updated>2014-09-16T09:30:34Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* First Platform to test How to Show Emotions */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=Aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualities occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to social situations. If someone does not respond in the expected way, he/she is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their task, but also be accepted by humans, meaning that people feel comfortable interacting with them. As a consequence, social robots must have the capacity to show emotions and behave in a socially correct way. However, building robots that can accomplish their tasks and show emotion is not easy, due to the difficulty of selecting the correct emotion and showing it in a way humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find a real environment that allows focusing the research efforts on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.). &lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because theatre constraints let the actor know what to say, how to react, and where objects and other actors are expected to be. All of this information is given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting theatre actors' training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre and not just a prop, as is currently the case. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational and social model of the world to represent its character's feelings and beliefs about the world. Besides, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction. &lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and actors' training, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not get a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how hard actors try to repeat the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' moods and, more importantly, the audience's attitude. &lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the collaboration and coordination of playwright, director, technical staff and performers. The performers, in particular, must work as a unit and show the public a coherent story. &lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework for focusing on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), which are provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes. &lt;br /&gt;
*Since the script is known before any performance, rehearsals can be used to get accustomed to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, which makes it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors essentially take one out of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that other modules must use, while dashed lines mean that the information is not mandatory, so the receiving module may or may not use it. If none of the modules accept any suggestions from the other modules, the system can be seen as a traditional action-decision system without any kind of emotional addition.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE COMPUTER SOC, p. 689-694 p. [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6681511 link]&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
[https://www.youtube.com/watch?v=AXAglJKLwbI Final Video]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing !!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W. chen Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics, 2009. ICAR 2009., 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in osaka university’s ’robot theater project’,” Japandaily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17081</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=17081"/>
				<updated>2014-09-16T09:29:23Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* First Platform to test How to Show Emotions */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=Aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualities occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interaction is based on responding correctly to social situations: someone who does not respond in the expected way is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their tasks, but also be accepted by humans, who must feel comfortable interacting with them. As a consequence, social robots must be able to show emotions and behave in a socially correct way. However, building robots that both accomplish their tasks and show emotion is not easy, because the correct emotion must be selected and shown in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find a real environment that allows research efforts to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.). &lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because theatre's constraints let the actor know what to say, how to react, and where the objects and the other actors are expected to be; all of this information is given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without using theatre actors' training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre's constraints to build a robotic platform and software that allow the robot to be an actor in the theatre, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational and social model of the world to represent its character's feelings and beliefs about the world. In addition, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction. &lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and actors' training, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not get a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how hard actors try to repeat the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' moods and, more importantly, the audience's attitude. &lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the collaboration and coordination of playwright, director, technical staff and performers. The performers, in particular, must work as a unit and show the public a coherent story. &lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework for focusing on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), which are provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes. &lt;br /&gt;
*Since the script is known before any performance, rehearsals can be used to get accustomed to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, which makes it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors essentially take one out of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that other modules must use, while dashed lines mean that the information is not mandatory, so the receiving module may or may not use it. If none of the modules accept any suggestions from the other modules, the system can be seen as a traditional action-decision system without any kind of emotional addition.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE COMPUTER SOC, p. 689-694 p. [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6681511 link]&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
[http://youtu.be/AXAglJKLwbI Final Video]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing !!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W. chen Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics, 2009. ICAR 2009., 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in osaka university’s ’robot theater project’,” Japandaily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=17000</id>
		<title>ODROID</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=17000"/>
				<updated>2014-07-10T14:31:53Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* TP-LINK WN725n v2 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;There are two [http://www.hardkernel.com/ ODROID] U2 and two U3 for development of small robots. Here are details and suggestions for their use.&lt;br /&gt;
&lt;br /&gt;
=== Useful How-Tos ===&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=81 Kernel recompiling process for the ODROID-U2]; it is necessary to do this if you ever wish to install external/out-of-tree drivers, as the &amp;quot;official package&amp;quot; is missing several critical kernel headers.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested OS and versions ===&lt;br /&gt;
* [http://odroid.in/ubuntu-server-13.05/ Lubuntu 13.05 Server] does not have a graphical interface; ssh is already installed. To install ROS, please follow these [http://wiki.ros.org/hydro/Installation/UbuntuARM instructions]&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=1193 Xubuntu 13.04] for armhf architectures, released by HardKernel with their [https://github.com/hardkernel/linux modified Linux kernel, version 3.0.75]. The kernel can be recompiled to enable 3D video acceleration, but this prevents USB video cameras from working, so it is useless for robot development (even more so since OpenCL cannot even be used).&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=12 Linaro Ubuntu 12.11] with HardKernel's Linux kernel, version 3.0.60. Grab [http://dn.odroid.com/Ubuntu_U2/20130125/ this image] dated 25-01-2013, flash it on your microSD card and then apply the latest kernel point update (at the time of writing, version 3.0.63, dated 13-02-2013) following the instructions given in the thread's opening post.&lt;br /&gt;
&lt;br /&gt;
'''All OSes have wireless networking issues, please read below for more information.'''&lt;br /&gt;
&lt;br /&gt;
=== Tested I/O devices ===&lt;br /&gt;
&lt;br /&gt;
====TP-LINK WN725n v2====&lt;br /&gt;
This device is not supported out of the box by the Linux kernel, so its driver must be compiled for the target platform. To do this, follow this [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=1674 tutorial].&lt;br /&gt;
&lt;br /&gt;
Sometimes the tutorial alone is not enough, because the system does not add the wlanX interface to the configuration file. In that case, it is necessary to modify the following file:&lt;br /&gt;
 sudo vim /etc/network/interfaces&lt;br /&gt;
Add the following lines to the file:&lt;br /&gt;
 auto wlanX&lt;br /&gt;
 iface wlanX inet dhcp&lt;br /&gt;
where X is the interface number, which can be found with:&lt;br /&gt;
 iwconfig&lt;br /&gt;
After saving the changes to the file, restart networking:&lt;br /&gt;
 sudo /etc/init.d/networking restart&lt;br /&gt;
If this command does not work reboot the system:&lt;br /&gt;
 sudo reboot now&lt;br /&gt;
To see the available wifi networks:&lt;br /&gt;
 sudo iwlist wlanX scanning&lt;br /&gt;
If this does not work, try&lt;br /&gt;
  sudo ifconfig wlanX up&lt;br /&gt;
and then repeat the steps above.&lt;br /&gt;
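The steps above can be collected into a single sketch. The interface name wlan0 is an assumption (substitute the name reported by iwconfig); the privileged commands are shown as comments and would be run as root on the board:&lt;br /&gt;

```shell
# Sketch of the wifi configuration steps described above.
# Assumption: the adapter shows up as wlan0 (check with iwconfig).
WLAN_IF="wlan0"

# The stanza that the steps above add to /etc/network/interfaces.
stanza="auto ${WLAN_IF}
iface ${WLAN_IF} inet dhcp"
printf '%s\n' "$stanza"

# On the ODROID itself (as root), the remaining steps would be:
#   printf '%s\n' "$stanza" >> /etc/network/interfaces
#   /etc/init.d/networking restart    # or: reboot
#   iwlist "$WLAN_IF" scanning        # list visible wifi networks
#   ifconfig "$WLAN_IF" up            # if the scan fails, bring the interface up first
```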
&lt;br /&gt;
==== Monitors ====&lt;br /&gt;
Monitors need a native HDMI interface to work with ODROIDs due to the strict requirements of the Exynos system-on-chip; for the same reason, external HDMI-to-DVI adapters are also '''not''' recommended, as you may end up with a blank screen ([http://odroid.us/mediawiki/index.php?title=Troubleshooting#My_Device_boots_.28The_alive_led_blinks.29.2C_but_it_doesnt_show_anything_on_the_display source]).&lt;br /&gt;
&lt;br /&gt;
This is a list of monitors and TV screens with native HDMI ports that are known to work.&lt;br /&gt;
* Sony KDL-32V4500&lt;br /&gt;
* Samsung Syncmaster XL2370 HD&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wireless communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wired communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Unresolved issues ===&lt;br /&gt;
&lt;br /&gt;
==== Wireless Networking (all tested OSes) ====&lt;br /&gt;
Wireless communications seem to be very troublesome, as the network interfaces tend to lose or drop a lot of packets, both in RX and TX, to the point that a reliable SSH connection cannot be established. This happens at least with the &amp;quot;officially supported&amp;quot; wireless NIC (with a Realtek RTL8191SU chipset), even when the wifi signal strength is excellent.&lt;br /&gt;
&lt;br /&gt;
A curious aspect of this issue is that only incoming packets get dropped a lot, while outgoing packets are very modestly affected.&lt;br /&gt;
&lt;br /&gt;
This being said, the RTL8191SU adapter loses some packets along the way also while being connected to other computers, so maybe it's really not the best USB wifi adapter ''ever''.&lt;br /&gt;
&lt;br /&gt;
  root@odroid:~# ifconfig wlan6&lt;br /&gt;
  wlan6     Link encap:Ethernet  HWaddr **:**:**:**:**:**&lt;br /&gt;
           inet addr:192.168.1.60  Bcast:192.168.1.255  Mask:255.255.255.0&lt;br /&gt;
           inet6 addr: fe80::ee1a:59ff:fe0e:f122/64 Scope:Link&lt;br /&gt;
           UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1&lt;br /&gt;
           RX packets:453 errors:0 dropped:600 overruns:0 frame:0&lt;br /&gt;
           TX packets:292 errors:0 dropped:7 overruns:0 carrier:0&lt;br /&gt;
           collisions:0 txqueuelen:1000&lt;br /&gt;
           RX bytes:79264 (79.2 KB)  TX bytes:35116 (35.1 KB)&lt;br /&gt;
&lt;br /&gt;
  odroid@odroid:~$ iwconfig wlan6&lt;br /&gt;
  wlan6     IEEE 802.11bg  ESSID:&amp;quot;MyWirelessNet&amp;quot;  Nickname:&amp;quot;&amp;lt;WIFI@REALTEK&amp;gt;&amp;quot;&lt;br /&gt;
          Mode:Managed  Frequency:2.437 GHz  Access Point: **:**:**:**:**:**&lt;br /&gt;
          Bit Rate:54 Mb/s   Sensitivity:0/0  &lt;br /&gt;
          Retry:off   RTS thr:off   Fragment thr:off&lt;br /&gt;
          Encryption key:****-****-****-****-****-****-****-****   Security mode:open&lt;br /&gt;
          Power Management:off&lt;br /&gt;
          Link Quality=100/100  Signal level=100/100  Noise level=0/100&lt;br /&gt;
          Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0&lt;br /&gt;
          Tx excessive retries:0  Invalid misc:0   Missed beacon:0&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== Realtek RTL8192CU =====&lt;br /&gt;
This chipset does not even seem to be detected by the stock HK kernel (any version of it), but it is possible to use it after [http://forum.odroid.com/viewtopic.php?f=29&amp;amp;t=1516 recompiling the kernel], disabling the in-kernel module for the NIC, and integrating [http://www.realtek.com/downloads/downloadsView.aspx?Langid=1&amp;amp;PNid=21&amp;amp;PFid=48&amp;amp;Level=5&amp;amp;Conn=4&amp;amp;DownTypeID=3&amp;amp;GetDown=false&amp;amp;Downloads=true Realtek's official driver] for this chipset into it (instructions are included in the package).&lt;br /&gt;
&lt;br /&gt;
However, you still get a huge packet loss rate, even if you disable the driver's power saving feature by loading the kernel module with&lt;br /&gt;
  sudo modprobe 8192cu rtw_power_mgnt=0&lt;br /&gt;
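To make this option persist across reboots, the same setting can be written to a modprobe configuration file; the file name 8192cu.conf below is an arbitrary choice, while the option string matches the modprobe command above:&lt;br /&gt;

```shell
# Persist the power-management option for the 8192cu module across reboots.
# The option string matches the modprobe command above; the file name is arbitrary.
conf="options 8192cu rtw_power_mgnt=0"
printf '%s\n' "$conf"
# On the board (as root):
#   printf '%s\n' "$conf" > /etc/modprobe.d/8192cu.conf
```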
&lt;br /&gt;
=== Solved issues ===&lt;br /&gt;
&lt;br /&gt;
==== lshw glitches (all tested OSes) ====&lt;br /&gt;
With the stock kernel, the lshw command seems to be partly broken; this can be worked around by calling it with&lt;br /&gt;
  sudo lshw -disable dmi&lt;br /&gt;
although this will prevent the detection of certain features of the board. Another way of fixing this issue is to recompile the kernel by hand (see the how-to linked above).&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=16999</id>
		<title>ODROID</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=16999"/>
				<updated>2014-07-09T14:25:46Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* Realtek RTL8192CU */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;There are two [http://www.hardkernel.com/ ODROID] U2 and two U3 for development of small robots. Here are details and suggestions for their use.&lt;br /&gt;
&lt;br /&gt;
=== Useful How-Tos ===&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=81 Kernel recompiling process for the ODROID-U2]; it is necessary to do this if you ever wish to install external/out-of-tree drivers, as the &amp;quot;official package&amp;quot; is missing several critical kernel headers.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested OS and versions ===&lt;br /&gt;
* [http://odroid.in/ubuntu-server-13.05/ Lubuntu 13.05 Server] does not have a graphical interface; ssh is already installed. To install ROS, please follow these [http://wiki.ros.org/hydro/Installation/UbuntuARM instructions]&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=1193 Xubuntu 13.04] for armhf architectures, released by HardKernel with their [https://github.com/hardkernel/linux modified Linux kernel, version 3.0.75]. The kernel can be recompiled to enable 3D video acceleration, but this prevents USB video cameras from working, so it is useless for robot development (even more so since OpenCL cannot even be used).&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=12 Linaro Ubuntu 12.11] with HardKernel's Linux kernel, version 3.0.60. Grab [http://dn.odroid.com/Ubuntu_U2/20130125/ this image] dated 25-01-2013, flash it on your microSD card and then apply the latest kernel point update (at the time of writing, version 3.0.63, dated 13-02-2013) following the instructions given in the thread's opening post.&lt;br /&gt;
&lt;br /&gt;
'''All OSes have wireless networking issues, please read below for more information.'''&lt;br /&gt;
&lt;br /&gt;
=== Tested I/O devices ===&lt;br /&gt;
&lt;br /&gt;
====TP-LINK WN725n v2====&lt;br /&gt;
This device is not supported out of the box by the Linux kernel, so its driver must be compiled for the target platform. To do this, follow this [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=1674 tutorial].&lt;br /&gt;
&lt;br /&gt;
Sometimes the tutorial alone is not enough, because the system does not add the wlanX interface to the configuration file. In that case, it is necessary to modify the following file:&lt;br /&gt;
 sudo vim /etc/network/interfaces&lt;br /&gt;
Add the following lines to the file:&lt;br /&gt;
 auto wlanX&lt;br /&gt;
 iface wlanX inet dhcp&lt;br /&gt;
where X is the interface number, which can be found with:&lt;br /&gt;
 iwconfig&lt;br /&gt;
After saving the changes to the file, restart networking:&lt;br /&gt;
 sudo /etc/init.d/networking restart&lt;br /&gt;
To see the available wifi networks:&lt;br /&gt;
 sudo iwlist wlanX scanning&lt;br /&gt;
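The steps above can be collected into a single sketch. The interface name wlan0 is an assumption (substitute the name reported by iwconfig); the privileged commands are shown as comments and would be run as root on the board:&lt;br /&gt;

```shell
# Sketch of the wifi configuration steps described above.
# Assumption: the adapter shows up as wlan0 (check with iwconfig).
WLAN_IF="wlan0"

# The stanza that the steps above add to /etc/network/interfaces.
stanza="auto ${WLAN_IF}
iface ${WLAN_IF} inet dhcp"
printf '%s\n' "$stanza"

# On the ODROID itself (as root), the remaining steps would be:
#   printf '%s\n' "$stanza" >> /etc/network/interfaces
#   /etc/init.d/networking restart    # or: reboot
#   iwlist "$WLAN_IF" scanning        # list visible wifi networks
```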
&lt;br /&gt;
==== Monitors ====&lt;br /&gt;
Monitors need a native HDMI interface to work with ODROIDs due to the strict requirements of the Exynos system-on-chip; for the same reason, external HDMI-to-DVI adapters are also '''not''' recommended, as you may end up with a blank screen ([http://odroid.us/mediawiki/index.php?title=Troubleshooting#My_Device_boots_.28The_alive_led_blinks.29.2C_but_it_doesnt_show_anything_on_the_display source]).&lt;br /&gt;
&lt;br /&gt;
This is a list of monitors and TV screens with native HDMI ports that are known to work.&lt;br /&gt;
* Sony KDL-32V4500&lt;br /&gt;
* Samsung Syncmaster XL2370 HD&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wireless communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wired communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Unresolved issues ===&lt;br /&gt;
&lt;br /&gt;
==== Wireless Networking (all tested OSes) ====&lt;br /&gt;
Wireless communication seems to be very troublesome: the network interfaces tend to lose or drop a lot of packets, both in RX and TX, to the point that a reliable SSH connection cannot be established. This happens at least with the &amp;quot;officially supported&amp;quot; wireless NIC (with a Realtek RTL8191SU chipset), even when the Wi-Fi signal strength is excellent.&lt;br /&gt;
&lt;br /&gt;
A curious aspect of this issue is that mostly incoming packets get dropped, while outgoing packets are only modestly affected.&lt;br /&gt;
&lt;br /&gt;
This being said, the RTL8191SU adapter also loses some packets when connected to other computers, so maybe it's really not the best USB Wi-Fi adapter ''ever''.&lt;br /&gt;
&lt;br /&gt;
  root@odroid:~# ifconfig wlan6&lt;br /&gt;
  wlan6     Link encap:Ethernet  HWaddr **:**:**:**:**:**&lt;br /&gt;
           inet addr:192.168.1.60  Bcast:192.168.1.255  Mask:255.255.255.0&lt;br /&gt;
           inet6 addr: fe80::ee1a:59ff:fe0e:f122/64 Scope:Link&lt;br /&gt;
           UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1&lt;br /&gt;
           RX packets:453 errors:0 dropped:600 overruns:0 frame:0&lt;br /&gt;
           TX packets:292 errors:0 dropped:7 overruns:0 carrier:0&lt;br /&gt;
           collisions:0 txqueuelen:1000&lt;br /&gt;
           RX bytes:79264 (79.2 KB)  TX bytes:35116 (35.1 KB)&lt;br /&gt;
&lt;br /&gt;
  odroid@odroid:~$ iwconfig wlan6&lt;br /&gt;
  wlan6     IEEE 802.11bg  ESSID:&amp;quot;MyWirelessNet&amp;quot;  Nickname:&amp;quot;&amp;lt;WIFI@REALTEK&amp;gt;&amp;quot;&lt;br /&gt;
          Mode:Managed  Frequency:2.437 GHz  Access Point: **:**:**:**:**:**&lt;br /&gt;
          Bit Rate:54 Mb/s   Sensitivity:0/0  &lt;br /&gt;
          Retry:off   RTS thr:off   Fragment thr:off&lt;br /&gt;
          Encryption key:****-****-****-****-****-****-****-****   Security mode:open&lt;br /&gt;
          Power Management:off&lt;br /&gt;
          Link Quality=100/100  Signal level=100/100  Noise level=0/100&lt;br /&gt;
          Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0&lt;br /&gt;
          Tx excessive retries:0  Invalid misc:0   Missed beacon:0&lt;br /&gt;
&lt;br /&gt;
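Besides reading the counters above, a quick way to quantify the loss is to send a burst of pings towards the access point and parse ping's summary line; the helper below is a sketch (the gateway address 192.168.1.1 is a placeholder, substitute your own):&lt;br /&gt;

```shell
# loss_pct: extract the "N% packet loss" figure from ping's summary line,
# which looks like: "100 packets transmitted, 40 received, 60% packet loss, time 99123ms"
loss_pct() { awk -F', ' '/packet loss/ {print $3}'; }

# Typical use against the access point (192.168.1.1 is a placeholder):
#   ping -c 100 -i 0.2 192.168.1.1 | loss_pct
```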
&lt;br /&gt;
===== Realtek RTL8192CU =====&lt;br /&gt;
This chipset does not even seem to be detected by the stock HK kernel (any version of it), but it can be used after [http://forum.odroid.com/viewtopic.php?f=29&amp;amp;t=1516 recompiling the kernel], disabling the in-kernel module for the NIC, and integrating [http://www.realtek.com/downloads/downloadsView.aspx?Langid=1&amp;amp;PNid=21&amp;amp;PFid=48&amp;amp;Level=5&amp;amp;Conn=4&amp;amp;DownTypeID=3&amp;amp;GetDown=false&amp;amp;Downloads=true Realtek's official driver] for this chipset into it (instructions are included in the package).&lt;br /&gt;
&lt;br /&gt;
However, the packet loss rate remains huge even if you disable the driver's power-saving feature by loading the kernel module with&lt;br /&gt;
  sudo modprobe 8192cu rtw_power_mgnt=0&lt;br /&gt;
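The modprobe option only lasts until the module is reloaded; to keep it across reboots you can drop it into a modprobe configuration file (the file name below is an arbitrary choice):&lt;br /&gt;

```shell
# Persist the power-management option for the 8192cu module across reboots
echo "options 8192cu rtw_power_mgnt=0" | sudo tee /etc/modprobe.d/8192cu.conf
```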
&lt;br /&gt;
=== Solved issues ===&lt;br /&gt;
&lt;br /&gt;
==== lshw glitches (all tested OSes) ====&lt;br /&gt;
With the stock kernel the lshw command seems to be partly broken, but this is resolved by calling it with&lt;br /&gt;
  sudo lshw -disable dmi&lt;br /&gt;
although this will prevent the detection of certain features of the board. Another way of fixing this issue is to recompile the kernel by hand (see the how-to linked above).&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16922</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16922"/>
				<updated>2014-05-09T09:45:51Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* Conferences */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=Aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualities occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interaction is based on responding correctly to social situations: someone who does not respond in the expected way is marginalized by others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their tasks but also be accepted by humans, who must feel comfortable interacting with them. As a consequence, social robots must be able to show emotions and behave in a socially correct way. However, building robots that can both accomplish their tasks and show emotion is not easy: on top of all the traditional problems of performing a given task, the robot must select the correct emotion and show it in a way that humans can understand. This makes it crucial to find a real environment that allows research efforts to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because its constraints let the actor know what to say, how to react, and where the objects and other actors are expected to be; all of this information is given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting theatre actors' training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational and social model of the world to represent its character's feelings and beliefs about the world. In addition, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and actors' lessons, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not get a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how much effort actors make to repeat the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' mood and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The performance outcome does not rely on one person. A good outcome comes from the correct collaboration and coordination of playwright, director, technical people and performers. The performers in particular must work as a unit and show the public a coherent story.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on the specific abilities and features needed to produce effective social and emotional interaction, without requiring other abilities (e.g., emotion detection, status detection, person recognition, etc.), since the corresponding information is provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any representation, rehearsals can be held to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is always used by the receiving modules, while dashed lines carry information that is not mandatory, so the receiving module may or may not use it. If no module accepts suggestions from the other modules, the system behaves as a traditional action-decision system without any kind of emotional component.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE COMPUTER SOC, p. 689-694 p. [http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6681511 link]&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing !!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, *Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/ talks/heather knight silicon based comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in&lt;br /&gt;
Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W. chen Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid&lt;br /&gt;
robots and theatric performance,” in International Conference on Advanced Robotics, 2009. ICAR 2009., 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and&lt;br /&gt;
K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in osaka university’s ’robot theater project’,” Japandaily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16921</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16921"/>
				<updated>2014-05-09T09:44:25Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=Aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualities occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interaction is based on responding correctly to social situations: someone who does not respond in the expected way is marginalized by others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their tasks but also be accepted by humans, who must feel comfortable interacting with them. As a consequence, social robots must be able to show emotions and behave in a socially correct way. However, building robots that can both accomplish their tasks and show emotion is not easy: on top of all the traditional problems of performing a given task, the robot must select the correct emotion and show it in a way that humans can understand. This makes it crucial to find a real environment that allows research efforts to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because its constraints let the actor know what to say, how to react, and where the objects and other actors are expected to be; all of this information is given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting theatre actors' training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational and social model of the world to represent its character's feelings and beliefs about the world. In addition, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and actors' lessons, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not get a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how much effort actors make to repeat the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' mood and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The performance outcome does not rely on one person. A good outcome comes from the correct collaboration and coordination of playwright, director, technical people and performers. The performers in particular must work as a unit and show the public a coherent story.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on the specific abilities and features needed to produce effective social and emotional interaction, without requiring other abilities (e.g., emotion detection, status detection, person recognition, etc.), since the corresponding information is provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any representation, rehearsals can be held to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is always used by the receiving modules, while dashed lines carry information that is not mandatory, so the receiving module may or may not use it. If no module accepts suggestions from the other modules, the system behaves as a traditional action-decision system without any kind of emotional component.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London. [http://doc.gold.ac.uk/aisb50/AISB50-S19/AISB50-S19-Angel-paper.pdf Link]&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE COMPUTER SOC, p. 689-694 p.&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing !!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, *Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/ talks/heather knight silicon based comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in&lt;br /&gt;
Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W. chen Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid&lt;br /&gt;
robots and theatric performance,” in International Conference on Advanced Robotics, 2009. ICAR 2009., 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and&lt;br /&gt;
K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in osaka university’s ’robot theater project’,” Japandaily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=16910</id>
		<title>ODROID</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=16910"/>
				<updated>2014-04-15T11:51:37Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* Tested OS and versions */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;There are two [http://www.hardkernel.com/ ODROID] U2 boards and two U3 boards available for the development of small robots. Here are details and suggestions for their use.&lt;br /&gt;
&lt;br /&gt;
=== Useful How-Tos ===&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=81 Kernel recompiling process for the ODROID-U2]; it is necessary to do this if you ever wish to install external/out-of-tree drivers, as the &amp;quot;official package&amp;quot; is missing several critical kernel headers.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested OS and versions ===&lt;br /&gt;
* [http://odroid.in/ubuntu-server-13.05/ Lubuntu 13.05 Server] does not have a graphical interface and comes with ssh already installed. To install ROS, please follow these [http://wiki.ros.org/hydro/Installation/UbuntuARM instructions].&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=1193 Xubuntu 13.04] for armhf architectures, released by HardKernel with their [https://github.com/hardkernel/linux modified Linux kernel, version 3.0.75]. The kernel can be recompiled to enable 3D video acceleration, but this would prevent USB video cameras from working, so it is useless for robot development (even more so since OpenCL cannot be used anyway).&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=12 Linaro Ubuntu 12.11] with HardKernel's Linux kernel, version 3.0.60. Grab [http://dn.odroid.com/Ubuntu_U2/20130125/ this image] dated 25-01-2013, flash it onto your microSD card, and then apply the latest kernel point update (at the time of writing, version 3.0.63, dated 13-02-2013) following the instructions in the thread's opening post.&lt;br /&gt;
&lt;br /&gt;
'''All OSes have wireless networking issues, please read below for more information.'''&lt;br /&gt;
&lt;br /&gt;
=== Tested I/O devices ===&lt;br /&gt;
&lt;br /&gt;
====TP-LINK WN725n v2====&lt;br /&gt;
This device is not supported out of the box by Linux, so its driver must be compiled for the target platform. To do this, you can follow this [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=1674 tutorial].&lt;br /&gt;
&lt;br /&gt;
Sometimes the tutorial alone is not enough, because the system does not add the wlanX interface to the configuration file. In that case, edit the following file:&lt;br /&gt;
 sudo vim /etc/network/interfaces&lt;br /&gt;
and add the following lines:&lt;br /&gt;
 auto wlanX&lt;br /&gt;
 iface wlanX inet dhcp&lt;br /&gt;
where X is the interface number, which can be found with:&lt;br /&gt;
 iwconfig&lt;br /&gt;
After saving the changes, restart the networking service:&lt;br /&gt;
 sudo /etc/init.d/networking restart&lt;br /&gt;
To list the available Wi-Fi networks:&lt;br /&gt;
 sudo iwlist wlanX scanning&lt;br /&gt;
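For reference, a complete stanza for a WPA2-protected network could look like the following sketch (this assumes the wpasupplicant package is installed; the interface name, SSID and passphrase shown are placeholders):&lt;br /&gt;

```
# /etc/network/interfaces -- example stanza for wlan0 on a WPA2 network
# "MyWirelessNet" and the passphrase are placeholders, substitute your own
auto wlan0
iface wlan0 inet dhcp
    wpa-ssid "MyWirelessNet"
    wpa-psk  "your-passphrase-here"
```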
&lt;br /&gt;
==== Monitors ====&lt;br /&gt;
Monitors need a native HDMI interface to work with ODROIDs due to the strict requirements of the Exynos system-on-chip; for the same reason, external HDMI-to-DVI adapters are also '''not''' recommended, as they can result in a blank screen ([http://odroid.us/mediawiki/index.php?title=Troubleshooting#My_Device_boots_.28The_alive_led_blinks.29.2C_but_it_doesnt_show_anything_on_the_display source]).&lt;br /&gt;
&lt;br /&gt;
This is a list of monitors and TV screens with native HDMI ports that are known to work.&lt;br /&gt;
* Sony KDL-32V4500&lt;br /&gt;
* Samsung Syncmaster XL2370 HD&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wireless communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wired communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Unresolved issues ===&lt;br /&gt;
&lt;br /&gt;
==== Wireless Networking (all tested OSes) ====&lt;br /&gt;
Wireless communication seems to be very troublesome: the network interfaces tend to lose or drop a lot of packets, both in RX and TX, to the point that a reliable SSH connection cannot be established. This happens at least with the &amp;quot;officially supported&amp;quot; wireless NIC (with a Realtek RTL8191SU chipset), even when the Wi-Fi signal strength is excellent.&lt;br /&gt;
&lt;br /&gt;
A curious aspect of this issue is that mostly incoming packets get dropped, while outgoing packets are only modestly affected.&lt;br /&gt;
&lt;br /&gt;
This being said, the RTL8191SU adapter also loses some packets when connected to other computers, so maybe it's really not the best USB Wi-Fi adapter ''ever''.&lt;br /&gt;
&lt;br /&gt;
  root@odroid:~# ifconfig wlan6&lt;br /&gt;
  wlan6     Link encap:Ethernet  HWaddr **:**:**:**:**:**&lt;br /&gt;
           inet addr:192.168.1.60  Bcast:192.168.1.255  Mask:255.255.255.0&lt;br /&gt;
           inet6 addr: fe80::ee1a:59ff:fe0e:f122/64 Scope:Link&lt;br /&gt;
           UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1&lt;br /&gt;
           RX packets:453 errors:0 dropped:600 overruns:0 frame:0&lt;br /&gt;
           TX packets:292 errors:0 dropped:7 overruns:0 carrier:0&lt;br /&gt;
           collisions:0 txqueuelen:1000&lt;br /&gt;
           RX bytes:79264 (79.2 KB)  TX bytes:35116 (35.1 KB)&lt;br /&gt;
&lt;br /&gt;
  odroid@odroid:~$ iwconfig wlan6&lt;br /&gt;
  wlan6     IEEE 802.11bg  ESSID:&amp;quot;MyWirelessNet&amp;quot;  Nickname:&amp;quot;&amp;lt;WIFI@REALTEK&amp;gt;&amp;quot;&lt;br /&gt;
          Mode:Managed  Frequency:2.437 GHz  Access Point: **:**:**:**:**:**&lt;br /&gt;
          Bit Rate:54 Mb/s   Sensitivity:0/0  &lt;br /&gt;
          Retry:off   RTS thr:off   Fragment thr:off&lt;br /&gt;
          Encryption key:****-****-****-****-****-****-****-****   Security mode:open&lt;br /&gt;
          Power Management:off&lt;br /&gt;
          Link Quality=100/100  Signal level=100/100  Noise level=0/100&lt;br /&gt;
          Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0&lt;br /&gt;
          Tx excessive retries:0  Invalid misc:0   Missed beacon:0&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== Realtek RTL8192CU =====&lt;br /&gt;
This chipset does not seem to be detected at all by the stock HK kernel (any version of it), but it can be used after recompiling the kernel, disabling the in-kernel module for the NIC, and integrating [http://www.realtek.com/downloads/downloadsView.aspx?Langid=1&amp;amp;PNid=21&amp;amp;PFid=48&amp;amp;Level=5&amp;amp;Conn=4&amp;amp;DownTypeID=3&amp;amp;GetDown=false&amp;amp;Downloads=true Realtek's official driver] for this chipset into it (instructions are included in the package).&lt;br /&gt;
&lt;br /&gt;
However, the packet loss rate remains huge even if you disable the driver's power-saving feature by loading the kernel module with&lt;br /&gt;
  sudo modprobe 8192cu rtw_power_mgnt=0&lt;br /&gt;
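To make this option persistent across reboots, the standard modprobe.d mechanism can be used. The sketch below is hedged: the file name 8192cu.conf is our choice, and the example writes to /tmp so it is safe to run, while on the board the target would be /etc/modprobe.d (written as root).&lt;br /&gt;

```shell
# Hedged sketch: persist rtw_power_mgnt=0 via a modprobe.d options line.
# On the board, this line belongs in /etc/modprobe.d/8192cu.conf (written
# as root); here we write to /tmp so the example can be run anywhere.
conf_line='options 8192cu rtw_power_mgnt=0'
printf '%s\n' "$conf_line" | tee /tmp/8192cu.conf
```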
&lt;br /&gt;
&lt;br /&gt;
=== Solved issues ===&lt;br /&gt;
&lt;br /&gt;
==== lshw glitches (all tested OSes) ====&lt;br /&gt;
With the stock kernel the lshw command seems to be partly broken; this can be worked around by calling it with&lt;br /&gt;
  sudo lshw -disable dmi&lt;br /&gt;
although this prevents the detection of certain features of the board. Another way to fix this issue is to recompile the kernel by hand (see the how-to linked above).&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=16909</id>
		<title>ODROID</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=16909"/>
				<updated>2014-04-15T11:49:36Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;There are two [http://www.hardkernel.com/ ODROID] U2 boards and two U3 boards for the development of small robots. Here are details and suggestions for their use.&lt;br /&gt;
&lt;br /&gt;
=== Useful How-Tos ===&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=81 Kernel recompiling process for the ODROID-U2]; it is necessary to do this if you ever wish to install external/out-of-tree drivers, as the &amp;quot;official package&amp;quot; is missing several critical kernel headers.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested OS and versions ===&lt;br /&gt;
* [http://odroid.in/ubuntu-server-13.05/ Lubuntu 13.05 Server] has no graphical interface and comes with SSH preinstalled.&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=1193 Xubuntu 13.04] for armhf architectures, released by HardKernel with their [https://github.com/hardkernel/linux modified Linux kernel, version 3.0.75]. The kernel can be recompiled to enable 3D video acceleration, but this prevents USB video cameras from working, so it is useless for robot development (even more so since OpenCL cannot even be used).&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=12 Linaro Ubuntu 12.11] with HardKernel's Linux kernel, version 3.0.60. Grab [http://dn.odroid.com/Ubuntu_U2/20130125/ this image] dated 25-01-2013, flash it onto your microSD card, and then apply the latest kernel point update (at the time of writing, version 3.0.63, dated 13-02-2013) following the instructions given in the thread's opening post.&lt;br /&gt;
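Flashing the image can be done with standard Linux tools; the following is a hedged sketch, where /dev/sdX is a placeholder for the card's device name, and the demo at the end copies between scratch files so nothing real is overwritten.&lt;br /&gt;

```shell
# Hedged sketch of writing a downloaded image to the microSD card.
# /dev/sdX is a placeholder: check the real device name with lsblk first,
# because dd overwrites whatever it is pointed at.
flash_image() {
    # $1 = image file, $2 = target block device (or a file, for the demo)
    dd if="$1" of="$2" bs=4M conv=fsync status=none
    sync
}
# Scratch demo on regular files, so no real device is touched:
printf 'demo image' | tee /tmp/odroid-demo.img
flash_image /tmp/odroid-demo.img /tmp/odroid-demo.out
```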
&lt;br /&gt;
'''All OSes have wireless networking issues; please read below for more information.'''&lt;br /&gt;
&lt;br /&gt;
=== Tested I/O devices ===&lt;br /&gt;
&lt;br /&gt;
====TP-LINK WN725n v2====&lt;br /&gt;
This device is not supported out of the box by Linux, so its driver must be compiled for the target platform. To do this, you can follow this [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=1674 tutorial].&lt;br /&gt;
&lt;br /&gt;
Sometimes the tutorial alone is not enough, because the system does not add the wlanX interface to the network configuration file. In that case, edit the following file:&lt;br /&gt;
 sudo vim /etc/network/interfaces&lt;br /&gt;
In the file, add the following lines:&lt;br /&gt;
 auto wlanX&lt;br /&gt;
 iface wlanX inet dhcp&lt;br /&gt;
where X is the interface number, which can be found with:&lt;br /&gt;
 iwconfig&lt;br /&gt;
After saving the changes to the file, restart networking:&lt;br /&gt;
 sudo /etc/init.d/networking restart&lt;br /&gt;
To list the available wifi networks:&lt;br /&gt;
 sudo iwlist wlanX scanning&lt;br /&gt;
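The manual steps above can also be sketched as a small helper that prints the stanza to append; this is a hedged example, and the interface name wlan6 is only illustrative (use the name iwconfig actually reports on your board).&lt;br /&gt;

```shell
#!/bin/sh
# Hedged sketch: print the stanza that the steps above add by hand to
# /etc/network/interfaces. "wlan6" is only an example interface name.
print_stanza() {
    printf 'auto %s\n' "$1"
    printf 'iface %s inet dhcp\n' "$1"
}
print_stanza wlan6
```

Redirect the output into /etc/network/interfaces (as root) if the stanza matches your setup.&lt;br /&gt;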
&lt;br /&gt;
==== Monitors ====&lt;br /&gt;
Monitors need a native HDMI interface to work with ODROIDs due to the strict requirements of the Exynos system-on-chip; for the same reason, external HDMI-to-DVI adapters are also '''not''' recommended, as you may end up with a blank screen ([http://odroid.us/mediawiki/index.php?title=Troubleshooting#My_Device_boots_.28The_alive_led_blinks.29.2C_but_it_doesnt_show_anything_on_the_display source]).&lt;br /&gt;
&lt;br /&gt;
This is a list of monitors and TV screens with native HDMI ports that are known to work.&lt;br /&gt;
* Sony KDL-32V4500&lt;br /&gt;
* Samsung Syncmaster XL2370 HD&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wireless communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wired communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Unresolved issues ===&lt;br /&gt;
&lt;br /&gt;
==== Wireless Networking (all tested OSes) ====&lt;br /&gt;
Wireless communication seems to be very troublesome: the network interfaces tend to drop a lot of packets, both in RX and TX, to the point that a reliable SSH connection cannot be established. This happens at least with the &amp;quot;officially supported&amp;quot; wireless NIC (based on a Realtek RTL8191SU chipset), even when the wifi signal strength is excellent.&lt;br /&gt;
&lt;br /&gt;
A curious aspect of this issue is that incoming packets are dropped in large numbers, while outgoing packets are only modestly affected.&lt;br /&gt;
&lt;br /&gt;
That said, the RTL8191SU adapter also loses some packets when connected to other computers, so maybe it is really not the best USB wifi adapter ''ever''.&lt;br /&gt;
&lt;br /&gt;
  root@odroid:~# ifconfig wlan6&lt;br /&gt;
  wlan6     Link encap:Ethernet  HWaddr **:**:**:**:**:**&lt;br /&gt;
           inet addr:192.168.1.60  Bcast:192.168.1.255  Mask:255.255.255.0&lt;br /&gt;
           inet6 addr: fe80::ee1a:59ff:fe0e:f122/64 Scope:Link&lt;br /&gt;
           UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1&lt;br /&gt;
           RX packets:453 errors:0 dropped:600 overruns:0 frame:0&lt;br /&gt;
           TX packets:292 errors:0 dropped:7 overruns:0 carrier:0&lt;br /&gt;
           collisions:0 txqueuelen:1000&lt;br /&gt;
           RX bytes:79264 (79.2 KB)  TX bytes:35116 (35.1 KB)&lt;br /&gt;
&lt;br /&gt;
  odroid@odroid:~$ iwconfig wlan6&lt;br /&gt;
  wlan6     IEEE 802.11bg  ESSID:&amp;quot;MyWirelessNet&amp;quot;  Nickname:&amp;quot;&amp;lt;WIFI@REALTEK&amp;gt;&amp;quot;&lt;br /&gt;
          Mode:Managed  Frequency:2.437 GHz  Access Point: **:**:**:**:**:**&lt;br /&gt;
          Bit Rate:54 Mb/s   Sensitivity:0/0  &lt;br /&gt;
          Retry:off   RTS thr:off   Fragment thr:off&lt;br /&gt;
          Encryption key:****-****-****-****-****-****-****-****   Security mode:open&lt;br /&gt;
          Power Management:off&lt;br /&gt;
          Link Quality=100/100  Signal level=100/100  Noise level=0/100&lt;br /&gt;
          Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0&lt;br /&gt;
          Tx excessive retries:0  Invalid misc:0   Missed beacon:0&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== Realtek RTL8192CU =====&lt;br /&gt;
This chipset does not seem to be detected at all by the stock HK kernel (any version of it), but it can be used after recompiling the kernel, disabling the in-kernel module for the NIC, and integrating [http://www.realtek.com/downloads/downloadsView.aspx?Langid=1&amp;amp;PNid=21&amp;amp;PFid=48&amp;amp;Level=5&amp;amp;Conn=4&amp;amp;DownTypeID=3&amp;amp;GetDown=false&amp;amp;Downloads=true Realtek's official driver] for this chipset into it (instructions are included in the package).&lt;br /&gt;
&lt;br /&gt;
However, the packet loss rate remains huge even if you disable the driver's power-saving feature by loading the kernel module with&lt;br /&gt;
  sudo modprobe 8192cu rtw_power_mgnt=0&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Solved issues ===&lt;br /&gt;
&lt;br /&gt;
==== lshw glitches (all tested OSes) ====&lt;br /&gt;
With the stock kernel the lshw command seems to be partly broken; this can be worked around by calling it with&lt;br /&gt;
  sudo lshw -disable dmi&lt;br /&gt;
although this prevents the detection of certain features of the board. Another way to fix this issue is to recompile the kernel by hand (see the how-to linked above).&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=16908</id>
		<title>ODROID</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=ODROID&amp;diff=16908"/>
				<updated>2014-04-15T10:44:31Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* Tested OS and versions */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;There are two [http://www.hardkernel.com/ ODROID] U2 boards for the development of small robots. Here are details and suggestions for their use.&lt;br /&gt;
&lt;br /&gt;
=== Useful How-Tos ===&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=52&amp;amp;t=81 Kernel recompiling process for the ODROID-U2]; it is necessary to do this if you ever wish to install external/out-of-tree drivers, as the &amp;quot;official package&amp;quot; is missing several critical kernel headers.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested OS and versions ===&lt;br /&gt;
* [http://odroid.in/ubuntu-server-13.05/ Lubuntu 13.05 Server] has no graphical interface and comes with SSH preinstalled.&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=1193 Xubuntu 13.04] for armhf architectures, released by HardKernel with their [https://github.com/hardkernel/linux modified Linux kernel, version 3.0.75]. The kernel can be recompiled to enable 3D video acceleration, but this prevents USB video cameras from working, so it is useless for robot development (even more so since OpenCL cannot even be used).&lt;br /&gt;
* [http://forum.odroid.com/viewtopic.php?f=8&amp;amp;t=12 Linaro Ubuntu 12.11] with HardKernel's Linux kernel, version 3.0.60. Grab [http://dn.odroid.com/Ubuntu_U2/20130125/ this image] dated 25-01-2013, flash it onto your microSD card, and then apply the latest kernel point update (at the time of writing, version 3.0.63, dated 13-02-2013) following the instructions given in the thread's opening post.&lt;br /&gt;
&lt;br /&gt;
'''All OSes have wireless networking issues; please read below for more information.'''&lt;br /&gt;
&lt;br /&gt;
=== Tested I/O devices ===&lt;br /&gt;
&lt;br /&gt;
==== Monitors ====&lt;br /&gt;
Monitors need a native HDMI interface to work with ODROIDs due to the strict requirements of the Exynos system-on-chip; for the same reason, external HDMI-to-DVI adapters are also '''not''' recommended, as you may end up with a blank screen ([http://odroid.us/mediawiki/index.php?title=Troubleshooting#My_Device_boots_.28The_alive_led_blinks.29.2C_but_it_doesnt_show_anything_on_the_display source]).&lt;br /&gt;
&lt;br /&gt;
This is a list of monitors and TV screens with native HDMI ports that are known to work.&lt;br /&gt;
* Sony KDL-32V4500&lt;br /&gt;
* Samsung Syncmaster XL2370 HD&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wireless communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Tested wired communication ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Unresolved issues ===&lt;br /&gt;
&lt;br /&gt;
==== Wireless Networking (all tested OSes) ====&lt;br /&gt;
Wireless communication seems to be very troublesome: the network interfaces tend to drop a lot of packets, both in RX and TX, to the point that a reliable SSH connection cannot be established. This happens at least with the &amp;quot;officially supported&amp;quot; wireless NIC (based on a Realtek RTL8191SU chipset), even when the wifi signal strength is excellent.&lt;br /&gt;
&lt;br /&gt;
A curious aspect of this issue is that incoming packets are dropped in large numbers, while outgoing packets are only modestly affected.&lt;br /&gt;
&lt;br /&gt;
That said, the RTL8191SU adapter also loses some packets when connected to other computers, so maybe it is really not the best USB wifi adapter ''ever''.&lt;br /&gt;
&lt;br /&gt;
  root@odroid:~# ifconfig wlan6&lt;br /&gt;
  wlan6     Link encap:Ethernet  HWaddr **:**:**:**:**:**&lt;br /&gt;
           inet addr:192.168.1.60  Bcast:192.168.1.255  Mask:255.255.255.0&lt;br /&gt;
           inet6 addr: fe80::ee1a:59ff:fe0e:f122/64 Scope:Link&lt;br /&gt;
           UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1&lt;br /&gt;
           RX packets:453 errors:0 dropped:600 overruns:0 frame:0&lt;br /&gt;
           TX packets:292 errors:0 dropped:7 overruns:0 carrier:0&lt;br /&gt;
           collisions:0 txqueuelen:1000&lt;br /&gt;
           RX bytes:79264 (79.2 KB)  TX bytes:35116 (35.1 KB)&lt;br /&gt;
&lt;br /&gt;
  odroid@odroid:~$ iwconfig wlan6&lt;br /&gt;
  wlan6     IEEE 802.11bg  ESSID:&amp;quot;MyWirelessNet&amp;quot;  Nickname:&amp;quot;&amp;lt;WIFI@REALTEK&amp;gt;&amp;quot;&lt;br /&gt;
          Mode:Managed  Frequency:2.437 GHz  Access Point: **:**:**:**:**:**&lt;br /&gt;
          Bit Rate:54 Mb/s   Sensitivity:0/0  &lt;br /&gt;
          Retry:off   RTS thr:off   Fragment thr:off&lt;br /&gt;
          Encryption key:****-****-****-****-****-****-****-****   Security mode:open&lt;br /&gt;
          Power Management:off&lt;br /&gt;
          Link Quality=100/100  Signal level=100/100  Noise level=0/100&lt;br /&gt;
          Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0&lt;br /&gt;
          Tx excessive retries:0  Invalid misc:0   Missed beacon:0&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== Realtek RTL8192CU =====&lt;br /&gt;
This chipset does not seem to be detected at all by the stock HK kernel (any version of it), but it can be used after recompiling the kernel, disabling the in-kernel module for the NIC, and integrating [http://www.realtek.com/downloads/downloadsView.aspx?Langid=1&amp;amp;PNid=21&amp;amp;PFid=48&amp;amp;Level=5&amp;amp;Conn=4&amp;amp;DownTypeID=3&amp;amp;GetDown=false&amp;amp;Downloads=true Realtek's official driver] for this chipset into it (instructions are included in the package).&lt;br /&gt;
&lt;br /&gt;
However, the packet loss rate remains huge even if you disable the driver's power-saving feature by loading the kernel module with&lt;br /&gt;
  sudo modprobe 8192cu rtw_power_mgnt=0&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Solved issues ===&lt;br /&gt;
&lt;br /&gt;
==== lshw glitches (all tested OSes) ====&lt;br /&gt;
With the stock kernel the lshw command seems to be partly broken; this can be worked around by calling it with&lt;br /&gt;
  sudo lshw -disable dmi&lt;br /&gt;
although this prevents the detection of certain features of the board. Another way to fix this issue is to recompile the kernel by hand (see the how-to linked above).&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16882</id>
		<title>User:JulianMauricioAngelFernandez</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16882"/>
				<updated>2014-04-11T10:49:01Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{PhD&lt;br /&gt;
|category=PhD&lt;br /&gt;
|firstname=Julian Mauricio&lt;br /&gt;
|lastname=Angel Fernandez&lt;br /&gt;
|photo=296834_10150265312801143_1840200_n.jpg&lt;br /&gt;
|email=julianmauricio.angel@polimi.it&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|advisor=AndreaBonarini;&lt;br /&gt;
|projectpage= TheatreBot&lt;br /&gt;
|status=active&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
I am a PhD student at Politecnico di Milano. I hold a Master's degree in Computer Engineering from Universidad de los Andes, Bogotá, Colombia, where I also received my bachelor's degrees in Computer Engineering and Electronic Engineering.&lt;br /&gt;
&lt;br /&gt;
==PhD Thesis==&lt;br /&gt;
[[TheatreBot]] is a system that allows a robot to act in theatre, interacting naturally and autonomously with humans and other robots.&lt;br /&gt;
&lt;br /&gt;
==Interests==&lt;br /&gt;
*Autonomous Robotics&lt;br /&gt;
*Human Robot Interaction&lt;br /&gt;
&lt;br /&gt;
==Societies==&lt;br /&gt;
IEEE&lt;br /&gt;
* RAS&lt;br /&gt;
* Computer Engineering&lt;br /&gt;
==Conferences==&lt;br /&gt;
* Affective Computing and Intelligent Interaction 2013&lt;br /&gt;
== Education ==&lt;br /&gt;
&lt;br /&gt;
===Graduate===&lt;br /&gt;
'''Master of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_120111200/676.pdf Cooperative Architecture for Multi-Agent Systems in Robotic Soccer (CAMASS)]&lt;br /&gt;
&lt;br /&gt;
===Undergraduate===&lt;br /&gt;
'''Bachelor of Science, Electronic Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_2008_primer_semestre/0000469.pdf A Comparison of Neural Networks to Detect Failures in MEMS]&lt;br /&gt;
&lt;br /&gt;
'''Bachelor of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: Application of Kalman and Particle filter in Mobile Robotics&lt;br /&gt;
&lt;br /&gt;
==Papers==&lt;br /&gt;
[http://scholar.google.it/citations?user=QgHKdBwAAAAJ&amp;amp;hl=en Google Scholar]&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE Computer Society, pp. 689-694.&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16846</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16846"/>
				<updated>2014-03-26T11:01:51Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* Conferences */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interaction depends on responding correctly to social situations: someone who does not respond in the expected way is marginalized by others. Robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should therefore not only accomplish their tasks but also be accepted by humans, meaning that people feel comfortable interacting with them. As a consequence, social robots must be able to show emotions and behave in a socially appropriate way. However, building robots that can both accomplish their tasks and show emotion is not easy: on top of all the traditional problems of performing a given task, the robot must select the correct emotion and display it in a way humans can understand. This makes it crucial to find a real environment that allows research efforts to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.). &lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because theatre's constraints let the actor know what to say, how to react, and where the objects and other actors are expected to be: all of this information is given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without using theatre actors' training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre's constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as is currently the case. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character's feelings and beliefs about the world. In addition, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities for showing emotional and social interaction. &lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, constraints and actors' lessons, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, forgetting the essential points that make theatre a lively art [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how hard actors try to repeat the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' moods and, more importantly, the audience's attitude. &lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the correct collaboration and coordination of playwright, director, technical staff and performers. The performers, in particular, must work as a unit and show the public a coherent story. &lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework in which to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), which are provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes. &lt;br /&gt;
*Since the script is known before any performance, rehearsals can be held to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is required by other modules, while dashed lines indicate information that is optional, so the receiving module may or may not use it. If none of the modules accept suggestions from the other modules, the system can be seen as a traditional action decision system without any kind of emotional component.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London.&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE Computer Society, pp. 689-694.&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Taros 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/ talks/heather knight silicon based comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W. chen Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics, 2009. ICAR 2009., 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in osaka university’s ’robot theater project’,” Japandaily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16845</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16845"/>
				<updated>2014-03-26T10:59:51Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* Conferences */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to social situations: someone who does not respond in the expected way is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms, and public spaces, should not only accomplish their tasks, but also be accepted by humans, meaning that people feel comfortable interacting with them. As a consequence, social robots must be able to show emotions and behave in a socially correct way. However, building robots that can accomplish their tasks while showing emotion is not easy: beyond the traditional problems of performing a given task, the robot must select the correct emotion and display it in a way that humans can understand. It is therefore crucial to find a real environment that allows research efforts to focus on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because its constraints let the actor know what to say, how to react, and where objects and other actors are expected to be: all of this information is given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting the training theories of theatre actors to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational, social model of the world to represent its character’s feelings and beliefs about the world. Moreover, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints, and the lessons learned by actors, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how hard actors try to repeat the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' moods and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs, and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the correct collaboration and coordination of playwright, director, technical staff, and performers. The performers, in particular, must work as a unit and present a coherent story to the public.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on specific abilities and features to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), which are provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any performance, rehearsals can be held to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is always used by the receiving module, while dashed lines carry information that is not mandatory, so the receiving module may or may not use it. If none of the modules accepts suggestions from the other modules, the system reduces to a traditional action decision system without any emotional component.&lt;br /&gt;
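The mandatory (continuous) vs. optional (dashed) information flow can be sketched in a few lines of Python. This is only an illustration: the module names and the suggestion-merging policy are hypothetical, not taken from the actual TheatreBot code.

```python
class Module:
    """A module that always consumes its mandatory inputs (continuous lines)
    and may optionally merge in suggestions from other modules (dashed lines)."""

    def __init__(self, name, accept_suggestions=True):
        self.name = name
        self.accept_suggestions = accept_suggestions

    def step(self, mandatory_inputs, suggestions=None):
        # Mandatory inputs are always part of the module's working state.
        state = dict(mandatory_inputs)
        # Suggestions are merged only if this module chooses to accept them,
        # and never override a mandatory input with the same key.
        if self.accept_suggestions and suggestions:
            for key, value in suggestions.items():
                state.setdefault(key, value)
        return state

# A module that rejects suggestions behaves like a traditional action
# decision module, without any emotional addition.
action_decision = Module("ActionDecision", accept_suggestions=False)
print(action_decision.step({"goal": "cross_stage"}, {"emotion": "joy"}))

# A module that accepts suggestions lets emotion modulate its output.
action_modulation = Module("ActionModulation")
print(action_modulation.step({"action": "walk"}, {"emotion": "joy"}))
```

With suggestions rejected, only the mandatory inputs survive; with them accepted, the emotional suggestion is folded into the action, which mirrors how the dashed lines in the diagram add emotion on top of a conventional pipeline.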
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE Computer Society, pp. 689–694.&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. TAROS 2013.&lt;br /&gt;
* Studying People's Emotional Responses to Robot's Movements. Julián M. Angel F. and Andrea Bonarini. 2014. In 3rd International Symposium on New Frontiers in Human-Robot Interaction at AISB'14, London.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] G. Stebbins, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th International Conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16613</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16613"/>
				<updated>2013-11-13T22:12:54Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to social situations: someone who does not respond in the expected way is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms, and public spaces, should not only accomplish their tasks, but also be accepted by humans, meaning that people feel comfortable interacting with them. As a consequence, social robots must be able to show emotions and behave in a socially correct way. However, building robots that can accomplish their tasks while showing emotion is not easy: beyond the traditional problems of performing a given task, the robot must select the correct emotion and display it in a way that humans can understand. It is therefore crucial to find a real environment that allows research efforts to focus on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because its constraints let the actor know what to say, how to react, and where objects and other actors are expected to be: all of this information is given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting the training theories of theatre actors to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational, social model of the world to represent its character’s feelings and beliefs about the world. Moreover, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints, and the lessons learned by actors, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how hard actors try to repeat the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' moods and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs, and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the correct collaboration and coordination of playwright, director, technical staff, and performers. The performers, in particular, must work as a unit and present a coherent story to the public.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on specific abilities and features to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), which are provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any performance, rehearsals can be held to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is always used by the receiving module, while dashed lines carry information that is not mandatory, so the receiving module may or may not use it. If none of the modules accepts suggestions from the other modules, the system reduces to a traditional action decision system without any emotional component.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE Computer Society, pp. 689–694.&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. TAROS 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] G. Stebbins, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th International Conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16612</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16612"/>
				<updated>2013-11-13T22:12:25Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to social situations: someone who does not respond in the expected way is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms, and public spaces, should not only accomplish their tasks, but also be accepted by humans, meaning that people feel comfortable interacting with them. As a consequence, social robots must be able to show emotions and behave in a socially correct way. However, building robots that can accomplish their tasks while showing emotion is not easy: beyond the traditional problems of performing a given task, the robot must select the correct emotion and display it in a way that humans can understand. It is therefore crucial to find a real environment that allows research efforts to focus on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because its constraints let the actor know what to say, how to react, and where objects and other actors are expected to be: all of this information is given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting the training theories of theatre actors to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational, social model of the world to represent its character’s feelings and beliefs about the world. Moreover, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints, and the lessons learned by actors, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor fails to remember a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how hard actors try to repeat the same performance each time, subtle changes can be seen: actors' and objects' positions on stage, actors' moods and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs, and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of a performance does not rely on one person. A good outcome comes from the correct collaboration and coordination of playwright, director, technical staff, and performers. The performers, in particular, must work as a unit and present a coherent story to the public.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on specific abilities and features to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), which are provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any performance, rehearsals can be held to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
&lt;br /&gt;
== Software Architecture ==&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is always used by the receiving module, while dashed lines carry information that is not mandatory, so the receiving module may or may not use it. If none of the modules accepts suggestions from the other modules, the system reduces to a traditional action decision system without any emotional component.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE Computer Society, pp. 689–694.&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. TAROS 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing !!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, *Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/ talks/heather knight silicon based comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16611</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16611"/>
				<updated>2013-11-13T22:10:59Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the contingencies occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on correct responses to social situations: someone who does not respond in the expected way is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their tasks, but also be accepted by humans, which means that humans feel comfortable interacting with them. As a consequence, social robots must have the capacity to show emotions and behave in a socially correct way. However, building robots that can both accomplish their tasks and show emotion is not easy: besides all the traditional problems of performing a given task, the robot must select the correct emotion and show it in a way that humans can understand. This makes it crucial to find a real environment that allows research efforts to be focused on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several studies have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because theatre's constraints let the actor know what to say, how to react, and where objects and other actors are expected to be; all of this information is given beforehand in the script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting the training theories of theatre actors to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational and social model of the world to represent its character's feelings and beliefs about the world. Moreover, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is required by other modules, while dashed lines represent information that is not mandatory, so the receiving module may or may not use it. If none of the modules accepts any suggestion from the other modules, the system reduces to a traditional action decision system without any kind of emotion.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and the lessons learned by actors, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor forgets a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how much effort actors put into repeating the same performance every time, subtle changes can be seen: the stage positions of actors and objects, the actors' moods and, most importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of the performance does not rely on one person. A good outcome comes from the collaboration and coordination of playwright, director, technical staff and performers. The performers, in particular, must work as a unit and show the public a coherent story.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework in which to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), since the corresponding information is provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any performance, rehearsals can be held to get used to the positions of objects and performers.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE Computer Society, pp. 689-694.&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. TAROS 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I] &lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16610</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16610"/>
				<updated>2013-11-13T22:08:27Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the contingencies occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on correct responses to social situations: someone who does not respond in the expected way is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their tasks, but also be accepted by humans, which means that humans feel comfortable interacting with them. As a consequence, social robots must have the capacity to show emotions and behave in a socially correct way. However, building robots that can both accomplish their tasks and show emotion is not easy: besides all the traditional problems of performing a given task, the robot must select the correct emotion and show it in a way that humans can understand. This makes it crucial to find a real environment that allows research efforts to be focused on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several studies have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because theatre's constraints let the actor know what to say, how to react, and where objects and other actors are expected to be; all of this information is given beforehand in the script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting the training theories of theatre actors to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational and social model of the world to represent its character's feelings and beliefs about the world. Moreover, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is required by other modules, while dashed lines represent information that is not mandatory, so the receiving module may or may not use it. If none of the modules accepts any suggestion from the other modules, the system reduces to a traditional action decision system without any kind of emotion.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and the lessons learned by actors, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor forgets a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how much effort actors put into repeating the same performance every time, subtle changes can be seen: the stage positions of actors and objects, the actors' moods and, most importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of the performance does not rely on one person. A good outcome comes from the collaboration and coordination of playwright, director, technical staff and performers. The performers, in particular, must work as a unit and show the public a coherent story.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework in which to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), since the corresponding information is provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any performance, rehearsals can be held to get used to the positions of objects and performers.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE Computer Society, pp. 689-694.&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. TAROS 2013.&lt;br /&gt;
&lt;br /&gt;
== Videos ==&lt;br /&gt;
=== First Platform to test How to Show Emotions ===&lt;br /&gt;
[http://www.youtube.com/watch?v=KupoEzVlgXA RoboAct: Simple platform showing emotion I]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qi-oSQMKi2o RoboAct: Simple platform showing emotion II]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=-bEE2i4IvrQ RoboAct: Simple platform showing emotion III]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16596</id>
		<title>User:JulianMauricioAngelFernandez</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16596"/>
				<updated>2013-10-16T10:35:43Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{PhD&lt;br /&gt;
|category=PhD&lt;br /&gt;
|firstname=Julian Mauricio&lt;br /&gt;
|lastname=Angel Fernandez&lt;br /&gt;
|photo=296834_10150265312801143_1840200_n.jpg&lt;br /&gt;
|email=julianmauricio.angel@polimi.it&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|advisor=AndreaBonarini;&lt;br /&gt;
|projectpage= TheatreBot&lt;br /&gt;
|status=active&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
I am a PhD student at Politecnico di Milano. I hold a Master’s Degree in Computer Engineering from Universidad de los Andes, Bogotá, Colombia, where I also obtained my bachelor’s degrees in Computer Engineering and Electronic Engineering.&lt;br /&gt;
&lt;br /&gt;
==PhD Thesis==&lt;br /&gt;
[[TheatreBot]] is a system that allows a robot to act autonomously as a theatre actor, interacting naturally with humans and other robots.&lt;br /&gt;
&lt;br /&gt;
==Interests==&lt;br /&gt;
*Autonomous Robotics&lt;br /&gt;
*Human Robot Interaction&lt;br /&gt;
== Education ==&lt;br /&gt;
&lt;br /&gt;
===Graduate===&lt;br /&gt;
'''Master of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_120111200/676.pdf Cooperative Architecture for Multi-Agent Systems in Robotic Soccer (CAMASS)]&lt;br /&gt;
&lt;br /&gt;
===Undergraduate===&lt;br /&gt;
'''Bachelor of Science, Electronic Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_2008_primer_semestre/0000469.pdf A Comparison of Neural Networks to Detect Failures in MEMS]&lt;br /&gt;
&lt;br /&gt;
'''Bachelor of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: Application of Kalman and Particle filter in Mobile Robotics&lt;br /&gt;
&lt;br /&gt;
==Papers==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* Towards an Autonomous Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. Affective Computing and Intelligent Interaction (ACII) 2013. IEEE Computer Society, pp. 689-694.&lt;br /&gt;
* TheatreBot: A Software Architecture for a Theatrical Robot. Julián M. Angel F. and Andrea Bonarini. 2013. TAROS 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16595</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16595"/>
				<updated>2013-10-16T10:35:29Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: /* Papers */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the contingencies occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on correct responses to social situations: someone who does not respond in the expected way is marginalized by the others. Thus, robots that interact with humans in everyday places, such as homes, offices, classrooms and public spaces, should not only accomplish their tasks, but also be accepted by humans, which means that humans feel comfortable interacting with them. As a consequence, social robots must have the capacity to show emotions and behave in a socially correct way. However, building robots that can both accomplish their tasks and show emotion is not easy: besides all the traditional problems of performing a given task, the robot must select the correct emotion and show it in a way that humans can understand. This makes it crucial to find a real environment that allows research efforts to be focused on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several studies have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], because theatre's constraints let the actor know what to say, how to react, and where objects and other actors are expected to be; all of this information is given beforehand in the script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without exploiting the training theories of theatre actors to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as currently happens. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational and social model of the world to represent its character's feelings and beliefs about the world. Moreover, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is required by other modules, while dashed lines represent information that is not mandatory, so the receiving module may or may not use it. If none of the modules accepts any suggestion from the other modules, the system reduces to a traditional action decision system without any kind of emotion.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints and the lessons learned by actors, it is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, and the essential points that make theatre a lively art are often forgotten [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor forgets a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how much effort actors put into repeating the same performance every time, subtle changes can be seen: the stage positions of actors and objects, the actors' moods and, most importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can encourage or discourage them, affecting the whole performance.&lt;br /&gt;
*The outcome of the performance does not rely on one person. A good outcome comes from the collaboration and coordination of playwright, director, technical staff and performers. The performers, in particular, must work as a unit and show the public a coherent story.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), since the corresponding information is provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any performance, rehearsals can be held to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
== Papers ==&lt;br /&gt;
=== Conferences ===&lt;br /&gt;
* J. M. Angel F. and A. Bonarini, “Towards an Autonomous Theatrical Robot,” in Affective Computing and Intelligent Interaction (ACII 2013), IEEE, 2013, pp. 689–694.&lt;br /&gt;
* J. M. Angel F. and A. Bonarini, “TheatreBot: A Software Architecture for a Theatrical Robot,” in TAROS 2013.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” in RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI ’12). New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th International Conference on Social Robotics (ICSR ’12). Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16493</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16493"/>
				<updated>2013-07-03T09:27:35Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to social situations. If someone does not respond in an expected way, he/she is marginalized by the others. Thus, robots that interact with humans in everyday places, such as home, office, classroom and public spaces, should not only accomplish their task, but also be accepted by humans, which means that humans feel comfortable interacting with them. As a consequence, social robots must have the capacity to show emotions and behave in a socially correct way. However, building robots that can accomplish their tasks and show emotion is not easy, due to the difficulty of selecting the correct emotion and showing it in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find a real environment that allows focusing the research effort on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], thanks to theatre constraints: the actor knows what to say, how to react, and where objects and other actors are expected to be, all given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without using theatre actors’ training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as is currently the case. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. Besides, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is required by other modules, while dashed lines represent information that is not mandatory, so the receiving module may or may not use it. If no module accepts suggestions from the other modules, the system can be seen as a traditional action decision system without any emotional component.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints, and actors’ training, theatre is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots. People tend to think of theatre as a repetitive show, forgetting the essential points that make it a lively art [1]:&lt;br /&gt;
*During a theatre performance, actors do not have a second chance to perform in front of the same audience. If an actor forgets a line, or does not portray a believable character, the audience will get a bad impression of the play.&lt;br /&gt;
*Each performance is unique. No matter how hard actors try to repeat the same performance each time, subtle changes can be observed: the stage positions of actors and objects, the actors' mood and, more importantly, the audience's attitude.&lt;br /&gt;
*The audience's attitude affects the actors. Actors can hear laughs, coughs and silence, and can even feel the tension in the audience. This can energize or discourage them, affecting the whole performance.&lt;br /&gt;
*The performance outcome does not rely on one person. A good outcome comes from the collaboration and coordination of playwright, director, technical staff and performers. Performers, in particular, must work as a unit and present a coherent story to the audience.&lt;br /&gt;
&lt;br /&gt;
Theatre is an excellent framework to focus on the specific abilities and features needed to produce effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.), since the corresponding information is provided by the script and the director:&lt;br /&gt;
*The play script contains all the necessary information: actions, coordination cues, dialogues, and the characters' attitudes.&lt;br /&gt;
*Since the script is known before any performance, rehearsals can be held to get used to the objects' and performers' positions.&lt;br /&gt;
*The stage space is discretized, making it easier for directors to give instructions and for actors to remember their positions.&lt;br /&gt;
*Actors basically take one of eight preset orientations during a performance.&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” in RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI ’12). New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th International Conference on Social Robotics (ICSR ’12). Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16466</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16466"/>
				<updated>2013-06-06T10:52:09Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to social situations. If someone does not respond in an expected way, he/she is marginalized by the others. Thus, robots that interact with humans in everyday places, such as home, office, classroom and public spaces, should not only accomplish their task, but also be accepted by humans, which means that humans feel comfortable interacting with them. As a consequence, social robots must have the capacity to show emotions and behave in a socially correct way. However, building robots that can accomplish their tasks and show emotion is not easy, due to the difficulty of selecting the correct emotion and showing it in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find a real environment that allows focusing the research effort on the production of effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], thanks to theatre constraints: the actor knows what to say, how to react, and where objects and other actors are expected to be, all given beforehand in a script. However, the few works [9]-[19] that have put robots on stage in the last decade have used theatre as an environment to create robots for entertainment, without using theatre actors’ training theories to make the robot project emotions to the audience. TheatreBot aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as is currently the case. The system and platform will be designed to allow extension to other application areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. Besides, the concept of emotional state is used to add emotional features to action performance, thus obtaining a full range of possibilities to show emotional and social interaction.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
The following image shows the proposed architecture:&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
As can be seen, there are two kinds of lines: continuous lines represent information that is required by other modules, while dashed lines represent information that is not mandatory, so the receiving module may or may not use it. If no module accepts suggestions from the other modules, the system can be seen as a traditional action decision system without any emotional component.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints, and actors’ training, theatre is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” in RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI ’12). New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th International Conference on Social Robotics (ICSR ’12). Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16454</id>
		<title>User:JulianMauricioAngelFernandez</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16454"/>
				<updated>2013-06-06T09:36:00Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{PhD&lt;br /&gt;
|category=PhD&lt;br /&gt;
|firstname=Julian Mauricio&lt;br /&gt;
|lastname=Angel Fernandez&lt;br /&gt;
|photo=296834_10150265312801143_1840200_n.jpg&lt;br /&gt;
|email=julianmauricio.angel@polimi.it&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|advisor=AndreaBonarini;&lt;br /&gt;
|projectpage= TheatreBot&lt;br /&gt;
|status=active&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
I am a PhD student at Politecnico di Milano. I hold a Master’s Degree in Computer Engineering from the Universidad de los Andes, Bogotá, Colombia. From the same university, I received my bachelor’s degrees in Computer Engineering and Electronic Engineering.&lt;br /&gt;
&lt;br /&gt;
==PhD Thesis==&lt;br /&gt;
[[TheatreBot]] is a system that allows a robot to be a theatre actor, able to interact naturally and autonomously with humans and other robots.&lt;br /&gt;
&lt;br /&gt;
==Interests==&lt;br /&gt;
*Autonomous Robotics&lt;br /&gt;
*Human Robot Interaction&lt;br /&gt;
== Education ==&lt;br /&gt;
&lt;br /&gt;
===Graduate===&lt;br /&gt;
'''Master of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_120111200/676.pdf Cooperative Architecture for Multi-Agent Systems in Robotic Soccer (CAMASS)]&lt;br /&gt;
&lt;br /&gt;
===Undergraduate===&lt;br /&gt;
'''Bachelor of Science, Electronic Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_2008_primer_semestre/0000469.pdf A Comparison of Neural Networks to Detect Failures in MEMS]&lt;br /&gt;
&lt;br /&gt;
'''Bachelor of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: Application of Kalman and Particle filter in Mobile Robotics&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16443</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16443"/>
				<updated>2013-06-05T14:14:07Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to specific situations. If someone does not respond in the expected way, he/she is marginalized by the others. Thus, robots that interact with humans outside manufacturing settings, such as home, office, classroom and public spaces, should not only accomplish their task, but also be accepted by humans, which means that humans feel comfortable in the robots' presence. As a consequence, social robots must have the capacity to show emotions. However, building robots that can accomplish their task and show emotion is not easy, due to the difficulty of selecting the correct emotion and showing it in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find an environment that allows research to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
&lt;br /&gt;
Several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], thanks to theatre constraints: actors know what to say, how to react, and where objects and other actors are expected to be, all given beforehand in a script. Nevertheless, the few works [9]-[19] of the last decade have used theatre as an environment to create robots for entertainment tasks, without using theatre actors’ training theories to project emotions to the audience. TheatreBot therefore aims at exploiting theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as is currently the case. The system and platform are designed to allow extension to other areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. Besides, the concept of emotional state is used to add emotional features to actions, avoiding the hard-coding of every action the robot should execute.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
[[Image:TheatreBotArchitecture.png|900px]]&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, its constraints, and actors’ training, theatre is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
*[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
*[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
*[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
*[4] G. Hoffman, “On stage: Robots as performers,” in RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
*[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
*[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
*[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
*[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
*[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” in TEI 2011, January 2011.&lt;br /&gt;
*[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
*[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
*[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
*[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
*[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
*[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
*[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
*[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th International Conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
*[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16442</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16442"/>
				<updated>2013-06-05T14:02:16Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to specific situations: if someone does not respond in the expected way, he or she is marginalized by the others. Thus, robots that interact with humans outside of manufacturing settings, in places such as homes, offices, classrooms and public spaces, should not only accomplish their task, but also be accepted by humans, which means that humans feel comfortable in the robots’ presence. As a consequence, social robots must have the capacity to show emotions. However, building robots that can both accomplish their task and show emotion is not an easy job, due to the difficulty of selecting the correct emotion and of showing it in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find an environment that allows research to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
&lt;br /&gt;
Although several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], thanks to constraints such as knowing what to say, how to react, and where objects and other actors are expected to be, all given beforehand in a script, the few works [9]-[19] carried out in the last decade have used theatre as an environment to create robots that accomplish entertainment tasks, without exploiting the training theories of theatre actors to project emotions to the audience. As a consequence, TheatreBot aims to exploit theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as has been the case so far. The system and platform have been designed to allow extension to other areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. Besides, the concept of emotional state is used to add emotional features to actions, avoiding the hard-coding of all the actions that the robot should execute.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
[[File:TheatreBotArchitecture.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, constraints and actors’ lessons, theatre is an excellent place to test coordination and expressiveness in robots, and actor-training systems [1]–[3] can inspire the development of expressive robots.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
[4] G. Hoffman, “On stage: Robots as performers,” in RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” in TEI 2011, January 2011.&lt;br /&gt;
[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th International Conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16441</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16441"/>
				<updated>2013-06-05T14:01:54Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to specific situations: if someone does not respond in the expected way, he or she is marginalized by the others. Thus, robots that interact with humans outside of manufacturing settings, in places such as homes, offices, classrooms and public spaces, should not only accomplish their task, but also be accepted by humans, which means that humans feel comfortable in the robots’ presence. As a consequence, social robots must have the capacity to show emotions. However, building robots that can both accomplish their task and show emotion is not an easy job, due to the difficulty of selecting the correct emotion and of showing it in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find an environment that allows research to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
&lt;br /&gt;
Although several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], thanks to constraints such as knowing what to say, how to react, and where objects and other actors are expected to be, all given beforehand in a script, the few works [9]-[19] carried out in the last decade have used theatre as an environment to create robots that accomplish entertainment tasks, without exploiting the training theories of theatre actors to project emotions to the audience. As a consequence, TheatreBot aims to exploit theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as has been the case so far. The system and platform have been designed to allow extension to other areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. Besides, the concept of emotional state is used to add emotional features to actions, avoiding the hard-coding of all the actions that the robot should execute.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
[[File:TheatreBotArchitecture.png]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, constraints and actors’ lessons, theatre is an excellent place to test coordination and expressiveness in robots, and actor-training systems [1]–[3] can inspire the development of expressive robots.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
[4] G. Hoffman, “On stage: Robots as performers,” in RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” in TEI 2011, January 2011.&lt;br /&gt;
[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th International Conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16440</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16440"/>
				<updated>2013-06-05T14:01:36Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to specific situations: if someone does not respond in the expected way, he or she is marginalized by the others. Thus, robots that interact with humans outside of manufacturing settings, in places such as homes, offices, classrooms and public spaces, should not only accomplish their task, but also be accepted by humans, which means that humans feel comfortable in the robots’ presence. As a consequence, social robots must have the capacity to show emotions. However, building robots that can both accomplish their task and show emotion is not an easy job, due to the difficulty of selecting the correct emotion and of showing it in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find an environment that allows research to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
&lt;br /&gt;
Although several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], thanks to constraints such as knowing what to say, how to react, and where objects and other actors are expected to be, all given beforehand in a script, the few works [9]-[19] carried out in the last decade have used theatre as an environment to create robots that accomplish entertainment tasks, without exploiting the training theories of theatre actors to project emotions to the audience. As a consequence, TheatreBot aims to exploit theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as has been the case so far. The system and platform have been designed to allow extension to other areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. Besides, the concept of emotional state is used to add emotional features to actions, avoiding the hard-coding of all the actions that the robot should execute.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
[[File:TheatreBotArchitecture.png|20x50px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, constraints and actors’ lessons, theatre is an excellent place to test coordination and expressiveness in robots, and actor-training systems [1]–[3] can inspire the development of expressive robots.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
[2] J. Cavanaugh, Acting Means Doing!!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
[4] G. Hoffman, “On stage: Robots as performers,” in RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2011.&lt;br /&gt;
[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” in TEI 2011, January 2011.&lt;br /&gt;
[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics (ICAR 2009), 2009.&lt;br /&gt;
[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th International Conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16439</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16439"/>
				<updated>2013-06-05T14:00:11Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case coping with the unexpected events occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to specific situations: if someone does not respond in the expected way, he or she is marginalized by the others. Thus, robots that interact with humans outside of manufacturing settings, in places such as homes, offices, classrooms and public spaces, should not only accomplish their task, but also be accepted by humans, which means that humans feel comfortable in the robots’ presence. As a consequence, social robots must have the capacity to show emotions. However, building robots that can both accomplish their task and show emotion is not an easy job, due to the difficulty of selecting the correct emotion and of showing it in a way that humans can understand, on top of all the traditional problems of performing a given task. This makes it crucial to find an environment that allows research to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).&lt;br /&gt;
&lt;br /&gt;
Although several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], thanks to constraints such as knowing what to say, how to react, and where objects and other actors are expected to be, all given beforehand in a script, the few works [9]-[19] carried out in the last decade have used theatre as an environment to create robots that accomplish entertainment tasks, without exploiting the training theories of theatre actors to project emotions to the audience. As a consequence, TheatreBot aims to exploit theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as has been the case so far. The system and platform have been designed to allow extension to other areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. Besides, the concept of emotional state is used to add emotional features to actions, avoiding the hard-coding of all the actions that the robot should execute.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
*Motivation&lt;br /&gt;
&lt;br /&gt;
[[File:TheatreBotArchitecture.png]]&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics, constraints and actors’ lessons, theatre is an excellent place to test coordination and expressiveness in robots, and actor-training systems [1]–[3] can inspire the development of expressive robots.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
[2] J. Cavanaugh, Acting Means Doing !!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
[3] S. Genevieve, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655 vol.3.&lt;br /&gt;
[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.- H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
[11] H. Knight, “Heather knight: la comedia de silicio,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/ talks/heather knight silicon based comedy.html&lt;br /&gt;
[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in In AAAI Workshop Technical Report WS- 99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo, 2000, pp. 27–33.&lt;br /&gt;
[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics, 2009 (ICAR 2009), 2009.&lt;br /&gt;
[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=File:TheatreBotArchitecture.png&amp;diff=16438</id>
		<title>File:TheatreBotArchitecture.png</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=File:TheatreBotArchitecture.png&amp;diff=16438"/>
				<updated>2013-06-05T13:59:36Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: FirstImage&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;FirstImage&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16437</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16437"/>
				<updated>2013-06-05T13:55:33Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualties occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Human social interactions are based on the correct response to specific situations. If someone does not respond in the expected way, he or she is marginalized by the others. Thus, robots that interact with humans outside of manufacturing settings, such as homes, offices, classrooms and public spaces, should not only accomplish their tasks, but also be accepted by humans, who should feel comfortable in the robots’ presence. As a consequence, social robots must have the capacity to show emotions. However, building robots that can both accomplish their tasks and show emotion is not an easy job, due to the difficulty of selecting the correct emotion and of showing it in a way that is understandable by humans, on top of all the traditional problems of performing a given task. This makes it crucial to find an environment that allows research to focus on producing effective social and emotional interaction, without the need for other abilities (e.g., emotion detection, status detection, person recognition, etc.).  &lt;br /&gt;
&lt;br /&gt;
Although several researchers have suggested that theatre could be an excellent place to test social and emotional abilities [4]-[8], thanks to theatrical constraints such as knowing what to say, how to react, and where objects and other actors are expected to be, all of which is given beforehand in a script, the few works [9]-[19] carried out in the last decade have used theatre as an environment to create robots that accomplish entertainment tasks, without exploiting theatre actors’ training theories to project emotions to the audience. As a consequence, TheatreBot aims to exploit theatre constraints to build a robotic platform and software that allow the robot to be an actor in theatre, and not just a prop, as has happened so far. The system and platform have been designed to allow extension to other areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a relational social model of the world to represent its character’s feelings and beliefs about the world. Besides, the concept of emotional state is used to add emotional features to actions, avoiding the hard-coding of all the actions the robot should execute.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
*Belief&lt;br /&gt;
*Action Decision&lt;br /&gt;
*Action Modulation&lt;br /&gt;
*Description&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Theatre is considered a lively art [1]. Thanks to its characteristics and constraints, and to actors’ lessons, theatre is an excellent place to test coordination and expressiveness in robots, and actor training systems [1]–[3] can inspire the development of expressive robots.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
[1] E. Wilson and A. Goldfarb, Theatre: The Lively Art. McGraw-Hill Education, 2009.&lt;br /&gt;
[2] J. Cavanaugh, Acting Means Doing !!: Here Are All the Techniques You Need, Carrying You Confidently from Auditions Through Rehearsals - Blocking, Characterization - Into Performances, All the Way to Curtain Calls. CreateSpace, 2012.&lt;br /&gt;
[3] G. Stebbins, Delsarte System of Dramatic Expression. BiblioBazaar, 2009.&lt;br /&gt;
[4] G. Hoffman, “On stage: Robots as performers,” RSS 2011 Workshop on Human-Robot Interaction: Perspectives and Contributions to Robotics from the Human Sciences, 2009.&lt;br /&gt;
[5] C. Breazeal, A. Brooks, J. Gray, M. Hancher, C. Kidd, J. McBean, D. Stiehl, and J. Strickon, “Interactive robot theatre,” in Intelligent Robots and Systems, 2003. (IROS 2003). Proceedings. 2003 IEEE/RSJ International Conference on, vol. 4, 2003, pp. 3648–3655.&lt;br /&gt;
[6] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
[7] D. V. Lu and W. D. Smart, “Human-robot interactions as theatre,” in RO-MAN 2011. IEEE, 2011, pp. 473–478.&lt;br /&gt;
[8] C. Pinhanez, “Computer theater,” in Proc. of the Eighth International Symposium on Electronic Arts (ISEA’97), 1997.&lt;br /&gt;
[9] C.-Y. Lin, L.-C. Cheng, C.-C. Huang, L.-W. Chuang, W.-C. Teng, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “Versatile humanoid robots for theatrical performances,” International Journal of Advanced Robotic Systems, 2013.&lt;br /&gt;
[10] H. Knight, S. Satkin, V. Ramakrishna, and S. Divvala, “A savvy robot standup comic: Online learning through audience tracking,” TEI 2011, January 2011.&lt;br /&gt;
[11] H. Knight, “Heather Knight: Silicon-based comedy,” TED Ideas Worth Spreading, December 2010. [Online]. Available: http://www.ted.com/talks/heather_knight_silicon_based_comedy.html&lt;br /&gt;
[12] K. R. Wurst, “I comici roboti: Performing the lazzo of the statue from the commedia dell’arte,” in AAAI Mobile Robot Competition, 2002, pp. 124–128.&lt;br /&gt;
[13] A. Bruce, J. Knight, and I. R. Nourbakhsh, “Robot improv: Using drama to create believable agents,” in AAAI Workshop Technical Report WS-99-15 of the 8th Mobile Robot Competition and Exhibition. AAAI Press, Menlo Park, 2000, pp. 27–33.&lt;br /&gt;
[14] LAAS-CNRS, “Roboscopie, the robot takes the stage!” Internet. [Online]. Available: http://www.openrobots.org/wiki/roboscopie&lt;br /&gt;
[15] S. Lemaignan, M. Gharbi, J. Mainprice, M. Herrb, and R. Alami, “Roboscopie: a theatre performance for a human and a robot,” in Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction, ser. HRI ’12. New York, NY, USA: ACM, 2012, pp. 427–428.&lt;br /&gt;
[16] C.-Y. Lin, C.-K. Tseng, W.-C. Teng, W.-C. Lee, C.-H. Kuo, H.-Y. Gu, K.-L. Chung, and C.-S. Fahn, “The realization of robot theater: Humanoid robots and theatric performance,” in International Conference on Advanced Robotics, 2009 (ICAR 2009), 2009.&lt;br /&gt;
[17] G. Hoffman, R. Kubat, and C. Breazeal, “A hybrid control system for puppeteering a live robotic stage actor,” in RO-MAN 2008, M. Buss and K. Kühnlenz, Eds. IEEE, 2008, pp. 354–359.&lt;br /&gt;
[18] Z. Paré, “Robot drama research: from identification to synchronization,” in Proceedings of the 4th international conference on Social Robotics, ser. ICSR’12. Berlin, Heidelberg: Springer-Verlag, 2012, pp. 308–316.&lt;br /&gt;
[19] I. Torres, “Robots share the stage with human actors in Osaka University’s ‘Robot Theater Project’,” Japan Daily Press, February 2013.&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16433</id>
		<title>User:JulianMauricioAngelFernandez</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16433"/>
				<updated>2013-06-05T11:44:14Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{PhD&lt;br /&gt;
|category=PhD&lt;br /&gt;
|firstname=Julian Mauricio&lt;br /&gt;
|lastname=Angel Fernandez&lt;br /&gt;
|photo=296834_10150265312801143_1840200_n.jpg&lt;br /&gt;
|email=angel.fernandez@elet.polimi.it&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|advisor=AndreaBonarini;&lt;br /&gt;
|projectpage= TheatreBot&lt;br /&gt;
|status=active&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
I am a PhD student at Politecnico di Milano. I hold a Master’s degree in Computer Engineering from the Universidad de los Andes, Bogotá, Colombia. From the same university, I obtained my bachelor’s degrees in Computer Engineering and Electronic Engineering.&lt;br /&gt;
&lt;br /&gt;
==PhD Thesis==&lt;br /&gt;
[[TheatreBot]] is a system which allows a robot to be a theatre actor, with the ability to interact naturally and autonomously with humans and other robots.&lt;br /&gt;
&lt;br /&gt;
==Interest==&lt;br /&gt;
*Autonomous Robotics&lt;br /&gt;
*Human Robot Interaction&lt;br /&gt;
&lt;br /&gt;
== Education ==&lt;br /&gt;
&lt;br /&gt;
===Graduate===&lt;br /&gt;
'''Master of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_120111200/676.pdf Cooperative Architecture for Multi-Agent Systems in Robotic Soccer (CAMASS)]&lt;br /&gt;
&lt;br /&gt;
===Undergraduate===&lt;br /&gt;
'''Bachelor of Science, Electronic Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_2008_primer_semestre/0000469.pdf A Comparison of Neural Networks to Detect Failures in MEMS]&lt;br /&gt;
&lt;br /&gt;
'''Bachelor of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: Application of Kalman and Particle filter in Mobile Robotics&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16185</id>
		<title>User:JulianMauricioAngelFernandez</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16185"/>
				<updated>2013-04-11T09:54:32Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{PhD&lt;br /&gt;
|category=PhD&lt;br /&gt;
|firstname=Julian Mauricio&lt;br /&gt;
|lastname=Angel Fernandez&lt;br /&gt;
|photo=296834_10150265312801143_1840200_n.jpg&lt;br /&gt;
|email=angel.fernandez@elet.polimi.it&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|advisor=AndreaBonarini;&lt;br /&gt;
|projectpage= TheatreBot&lt;br /&gt;
|status=active&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
I am a PhD student at Politecnico di Milano. I hold a Master’s degree in Computer Engineering from the Universidad de los Andes, Bogotá, Colombia. From the same university, I obtained my bachelor’s degrees in Computer Engineering and Electronic Engineering.&lt;br /&gt;
&lt;br /&gt;
==PhD Thesis==&lt;br /&gt;
[[TheatreBot]] is a system which allows a robot to be a theatre actor, with the ability to interact naturally and autonomously with humans and other robots.&lt;br /&gt;
&lt;br /&gt;
== Education ==&lt;br /&gt;
&lt;br /&gt;
===Graduate===&lt;br /&gt;
'''Master of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_120111200/676.pdf Cooperative Architecture for Multi-Agent Systems in Robotic Soccer (CAMASS)]&lt;br /&gt;
&lt;br /&gt;
===Undergraduate===&lt;br /&gt;
'''Bachelor of Science, Electronic Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_2008_primer_semestre/0000469.pdf A Comparison of Neural Networks to Detect Failures in MEMS]&lt;br /&gt;
&lt;br /&gt;
'''Bachelor of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: Application of Kalman and Particle filter in Mobile Robotics&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16184</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16184"/>
				<updated>2013-04-11T09:51:32Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualties occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Theatre actors must project emotions to their whole audience to make it believe in the played character and to engage it in the play: the same principle operates in effective social relations. A theatrical robot actor should be built with the objective of performing as well as human actors, and it should have a simple interface that enables untrained people to give it basic instructions, which it can interpret to effectively play its role in the piece. Theatre provides us with a constrained environment in which to focus on the social part: emotion projection and action intention. This work focuses on the development of a system and platform that fulfill all the necessary specifications for such a robot actor. &lt;br /&gt;
&lt;br /&gt;
The system and platform have been designed to allow extension to other areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a social model of the world to represent its character’s feelings and beliefs about the world. Moreover, it uses the concept of emotional state to add emotional features to the actions that should be performed according to the script and the director’s directions.&lt;br /&gt;
&lt;br /&gt;
Roughly, the software architecture is composed of the following sub-systems:&lt;br /&gt;
* ''Emotional''&lt;br /&gt;
* ''Action Decision''&lt;br /&gt;
* ''Action Modulation''&lt;br /&gt;
* ''Features Description''&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Although theatre involves many elements, the main one is the audience’s imagination, and this is what makes the difference among theatre, television and movies. Additionally, each play is performed live, making every presentation unique and unrepeatable. Theatre actors have to embody characters, each with a personal life, a temper, and an own way of expressing itself: voice intonation, words, and even movements. Thus, the actors’ challenge is to convince the spectators that the characters they are portraying are real. As a consequence, actors must learn how to bring characters to life, which includes thinking about their personality, their likes and dislikes, and, most importantly, how the character shows emotions through movement. These movements must be coherent with the moves of the other characters and work in harmony to make the play a success.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16183</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16183"/>
				<updated>2013-04-11T09:45:21Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualties occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Theatre actors must project emotions to their whole audience to make it believe in the played character and to engage it in the play: the same principle operates in effective social relations. A theatrical robot actor should be built with the objective of performing as well as human actors, and it should have a simple interface that enables untrained people to give it basic instructions, which it can interpret to effectively play its role in the piece. Theatre provides us with a constrained environment in which to focus on the social part: emotion projection and action intention. This work focuses on the development of a system and platform that fulfill all the necessary specifications for such a robot actor. &lt;br /&gt;
&lt;br /&gt;
The system and platform have been designed to allow extension to other areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a social model of the world to represent its character’s feelings and beliefs about the world. Moreover, it uses the concept of emotional state to add emotional features to the actions that should be performed according to the script and the director’s directions.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Although theatre involves many elements, the main one is the audience’s imagination, and this is what makes the difference among theatre, television and movies. Additionally, each play is performed live, making every presentation unique and unrepeatable. Theatre actors have to embody characters, each with a personal life, a temper, and an own way of expressing itself: voice intonation, words, and even movements. Thus, the actors’ challenge is to convince the spectators that the characters they are portraying are real. As a consequence, actors must learn how to bring characters to life, which includes thinking about their personality, their likes and dislikes, and, most importantly, how the character shows emotions through movement. These movements must be coherent with the moves of the other characters and work in harmony to make the play a success.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16182</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16182"/>
				<updated>2013-04-11T09:45:06Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualties occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Theatre actors must project emotions to their whole audience to make it believe in the played character and to engage it in the play: the same principle operates in effective social relations. A theatrical robot actor should be built with the objective of performing as well as human actors, and it should have a simple interface that enables untrained people to give it basic instructions, which it can interpret to effectively play its role in the piece. Theatre provides us with a constrained environment in which to focus on the social part: emotion projection and action intention. This work focuses on the development of a system and platform that fulfill all the necessary specifications for such a robot actor. &lt;br /&gt;
The system and platform have been designed to allow extension to other areas where showing emotions is important, such as robot games and assistive robots. To accomplish this goal, the robot will use a social model of the world to represent its character’s feelings and beliefs about the world. Moreover, it uses the concept of emotional state to add emotional features to the actions that should be performed according to the script and the director’s directions.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Although theatre involves many elements, the main one is the audience’s imagination, and this is what makes the difference among theatre, television and movies. Additionally, each play is performed live, making every presentation unique and unrepeatable. Theatre actors have to embody characters, each with a personal life, a temper, and an own way of expressing itself: voice intonation, words, and even movements. Thus, the actors’ challenge is to convince the spectators that the characters they are portraying are real. As a consequence, actors must learn how to bring characters to life, which includes thinking about their personality, their likes and dislikes, and, most importantly, how the character shows emotions through movement. These movements must be coherent with the moves of the other characters and work in harmony to make the play a success.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16181</id>
		<title>TheatreBot</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=TheatreBot&amp;diff=16181"/>
				<updated>2013-04-11T09:32:22Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Project&lt;br /&gt;
|title=TheatreBot&lt;br /&gt;
|short_descr=The aim of this project is to produce autonomous robots able to play on stage together with human actors, possibly improvising, or in any case facing the casualties occurring on the scene.&lt;br /&gt;
|coordinator=AndreaBonarini&lt;br /&gt;
|tutor=AndreaBonarini; &lt;br /&gt;
|students=JulianMauricioAngelFernandez&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|restopic=Robot development;&lt;br /&gt;
|start=2013/01/12&lt;br /&gt;
|end=2016/12/31&lt;br /&gt;
|status=Active&lt;br /&gt;
|level=PhD&lt;br /&gt;
|type=Thesis&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
Theatre actors must project emotions to their whole audience to make it believe in the played character and to engage it in the play: the same principle operates in effective social relations. A theatrical robot actor should be built with the objective of performing as well as human actors, and it should have a simple interface that enables untrained people to give it basic instructions, which it can interpret to effectively play its role in the piece. This work focuses on the development of a system and platform that fulfill all the necessary specifications for such a robot actor. The development of a theatrical actor is a first step towards the implementation of effective autonomous robots communicating with people; the system and platform could be extended to other areas where showing emotions is important, such as robot games and assistive robots. &lt;br /&gt;
&lt;br /&gt;
== Why Theatre?  ==&lt;br /&gt;
Although theatre involves many elements, the main one is the audience’s imagination, and this is what makes the difference among theatre, television and movies. Additionally, each play is performed live, making every presentation unique and unrepeatable. Theatre actors have to embody characters, each with a personal life, a temper, and an own way of expressing itself: voice intonation, words, and even movements. Thus, the actors’ challenge is to convince the spectators that the characters they are portraying are real. As a consequence, actors must learn how to bring characters to life, which includes thinking about their personality, their likes and dislikes, and, most importantly, how the character shows emotions through movement. These movements must be coherent with the moves of the other characters and work in harmony to make the play a success.&lt;br /&gt;
&lt;br /&gt;
== Papers ==&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16180</id>
		<title>User:JulianMauricioAngelFernandez</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=16180"/>
				<updated>2013-04-11T09:26:38Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{PhD&lt;br /&gt;
|category=PhD&lt;br /&gt;
|firstname=Julian Mauricio&lt;br /&gt;
|lastname=Angel Fernandez&lt;br /&gt;
|photo=296834_10150265312801143_1840200_n.jpg&lt;br /&gt;
|email=angel.fernandez@elet.polimi.it&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|advisor=AndreaBonarini;&lt;br /&gt;
|projectpage= TheatreBot&lt;br /&gt;
|status=active&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
I am a PhD student at Politecnico di Milano. I hold a Master’s degree in Computer Engineering from the Universidad de los Andes, Bogotá, Colombia. From the same university, I obtained my bachelor’s degrees in Computer Engineering and Electronic Engineering.&lt;br /&gt;
&lt;br /&gt;
==PhD Thesis==&lt;br /&gt;
[[TheatreBot]] is a system which allows a robot to be a theatre actor, with the ability to interact naturally and autonomously with humans and other robots.&lt;br /&gt;
&lt;br /&gt;
== Education ==&lt;br /&gt;
&lt;br /&gt;
===Undergraduate===&lt;br /&gt;
'''Bachelor of Science, Electronic Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_2008_primer_semestre/0000469.pdf A Comparison of Neural Networks to Detect Failures in MEMS]&lt;br /&gt;
&lt;br /&gt;
'''Bachelor of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: Application of Kalman and Particle filter in Mobile Robotics &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Graduate===&lt;br /&gt;
'''Master of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_120111200/676.pdf Cooperative Architecture for Multi-Agent Systems in Robotic Soccer (CAMASS)]&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=15899</id>
		<title>User:JulianMauricioAngelFernandez</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=User:JulianMauricioAngelFernandez&amp;diff=15899"/>
				<updated>2013-01-30T21:12:10Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{PhD&lt;br /&gt;
|category=PhD&lt;br /&gt;
|firstname=Julian Mauricio&lt;br /&gt;
|lastname=Angel Fernandez&lt;br /&gt;
|photo=296834_10150265312801143_1840200_n.jpg&lt;br /&gt;
|email=angel.fernandez@elet.polimi.it&lt;br /&gt;
|resarea=Robotics&lt;br /&gt;
|advisor=AndreaBonarini;&lt;br /&gt;
|projectpage= TheatreBot&lt;br /&gt;
|status=active&lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
I am a PhD student at Politecnico di Milano. I hold a Master’s degree in Computer Engineering from the Universidad de los Andes, Bogotá, Colombia. From the same university, I obtained my bachelor’s degrees in Computer Engineering and Electronic Engineering.&lt;br /&gt;
&lt;br /&gt;
==PhD Thesis==&lt;br /&gt;
[[TheatreBot]]&lt;br /&gt;
&lt;br /&gt;
== Education ==&lt;br /&gt;
&lt;br /&gt;
===Undergraduate===&lt;br /&gt;
'''Bachelor of Science, Electronic Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_2008_primer_semestre/0000469.pdf A Comparison of Neural Networks to Detect Failures in MEMS]&lt;br /&gt;
&lt;br /&gt;
'''Bachelor of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: Application of Kalman and Particle filter in Mobile Robotics &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Graduate===&lt;br /&gt;
'''Master of Science, Systems and Computing Engineering'''&lt;br /&gt;
&lt;br /&gt;
[http://www.uniandes.edu.co Universidad de los Andes de Colombia]&lt;br /&gt;
&lt;br /&gt;
Final research project: [http://biblioteca.uniandes.edu.co/Tesis_120111200/676.pdf Cooperative Architecture for Multi-Agent Systems in Robotic Soccer (CAMASS)]&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	<entry>
		<id>https://airwiki.deib.polimi.it/index.php?title=File:296834_10150265312801143_1840200_n.jpg&amp;diff=15898</id>
		<title>File:296834 10150265312801143 1840200 n.jpg</title>
		<link rel="alternate" type="text/html" href="https://airwiki.deib.polimi.it/index.php?title=File:296834_10150265312801143_1840200_n.jpg&amp;diff=15898"/>
				<updated>2013-01-30T21:07:59Z</updated>
		
		<summary type="html">&lt;p&gt;JulianMauricioAngelFernandez: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>JulianMauricioAngelFernandez</name></author>	</entry>

	</feed>