<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/css" href="http://195.221.158.45/skins/common/feed.css?63"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="fr">
		<id>http://195.221.158.45/index.php?action=history&amp;feed=atom&amp;title=RSME_Digital_Ocean_2011</id>
	<title>RSME Digital Ocean 2011 - Revision history</title>
		<link rel="self" type="application/atom+xml" href="http://195.221.158.45/index.php?action=history&amp;feed=atom&amp;title=RSME_Digital_Ocean_2011"/>
		<link rel="alternate" type="text/html" href="http://195.221.158.45/index.php?title=RSME_Digital_Ocean_2011&amp;action=history"/>
		<updated>2026-04-12T02:26:35Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
		<generator>MediaWiki 1.10.0</generator>

	<entry>
		<id>http://195.221.158.45/index.php?title=RSME_Digital_Ocean_2011&amp;diff=7743&amp;oldid=prev</id>
	<title>Gi: New page: __NOTOC__ &lt;center&gt;&lt;big&gt;&lt;h2&gt; Post-doctoral position   (FP7-European project: DigitalOcean RSME 2011-2012)  Multimodal human robot interaction in mixed reality environments: applicatio...</title>
		<link rel="alternate" type="text/html" href="http://195.221.158.45/index.php?title=RSME_Digital_Ocean_2011&amp;diff=7743&amp;oldid=prev"/>
				<updated>2011-02-07T16:12:28Z</updated>
		
		<summary type="html">&lt;p&gt;New page: __NOTOC__ &amp;lt;center&amp;gt;&amp;lt;big&amp;gt;&amp;lt;h2&amp;gt; Post-doctoral position   (FP7-European project: DigitalOcean RSME 2011-2012)  Multimodal human robot interaction in mixed reality environments: applicatio...&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;__NOTOC__&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;big&amp;gt;&amp;lt;h2&amp;gt;&lt;br /&gt;
Post-doctoral position&lt;br /&gt;
 &lt;br /&gt;
(FP7-European project: DigitalOcean RSME 2011-2012)&lt;br /&gt;
&lt;br /&gt;
Multimodal human robot interaction in mixed reality environments: application to the DigitalOcean project.&lt;br /&gt;
&amp;lt;/h2&amp;gt;&amp;lt;/big&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The IBISC laboratory (University of Evry) is currently offering a post-doctoral position to work on an EC-&lt;br /&gt;
funded project entitled “DigitalOcean”.&lt;br /&gt;
&lt;br /&gt;
* Research topics: Virtual, Augmented and Mixed Reality, Telerobotics;&lt;br /&gt;
* Remuneration: 2100 to 2500 Euros per month, after deduction of social charges;&lt;br /&gt;
* Starting from: March 2011.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This postdoc aims to investigate new combinations of multimodal human-robot interaction (audio, video and&lt;br /&gt;
haptic) in mixed reality in order to teleoperate underwater robots (ROV or LAUV). It will be divided into two&lt;br /&gt;
tasks: setting up a mixed reality system and then working on a multimodal semi-immersive demonstrator.&lt;br /&gt;
&lt;br /&gt;
=== Mixed reality setup: ===&lt;br /&gt;
This task consists of studying and developing a mixed reality module in order to mix&lt;br /&gt;
real images (sent by a camera mounted on the ROV) with a virtual 3D model of an underwater site to explore.&lt;br /&gt;
The main objective is therefore the visualization of mixed reality contents. To achieve this, we will first study&lt;br /&gt;
and propose algorithms for (underwater) camera calibration and real-time registration of virtual models over real&lt;br /&gt;
images. Next, we will design a generic software architecture that will be used for two types of demonstrators: a&lt;br /&gt;
web-based one (for general public) and a semi-immersive mixed reality platform for multimodal teleoperation.&lt;br /&gt;
&lt;br /&gt;
=== Multimodal human-robot interface (semi-immersive demonstrator): ===&lt;br /&gt;
The main objective of this task is to&lt;br /&gt;
propose a more intuitive and multimodal Human-robot Interface (audio, video and haptic). This multimodal&lt;br /&gt;
interface will provide a way to interact with a remote robot (ROV) and will enhance human sensory channels. A&lt;br /&gt;
stereoscopic display will provide visualization (of 3D virtual models of the underwater site and ROV) and will&lt;br /&gt;
allow the user to learn and prepare exploration paths in a virtual submarine environment before the actual exploration. 3D&lt;br /&gt;
interaction using a haptic interface will allow the user to feel what the ROV feels during the exploration. The&lt;br /&gt;
simulated audio feedback of ROV motors during the real exploration provides additional sensory information that can&lt;br /&gt;
complement video and enhance human presence during the tele-exploration missions. The proposed&lt;br /&gt;
demonstrator will be a mixed reality mobile platform that can be transported for presentations, meetings,&lt;br /&gt;
conferences, etc. It may be connected to the Internet from any location to perform real, virtual or mixed&lt;br /&gt;
teleoperation missions.&lt;br /&gt;
&lt;br /&gt;
== Experience: ==&lt;br /&gt;
The candidate should have completed a PhD degree in Computer Science/Robotics/Computer Vision or&lt;br /&gt;
equivalent and have solid experience in Human-Computer Interaction or Human-Robot Interaction. The candidate&lt;br /&gt;
should be motivated to cooperate with our European partners. Experience in haptic interaction is appreciated.&lt;br /&gt;
&lt;br /&gt;
== Applications: ==&lt;br /&gt;
Interested candidates should send an application, including a detailed CV and a cover letter, to&lt;br /&gt;
&lt;br /&gt;
Mr. Samir Otmane ([mailto:Samir.otmane_(at)_ibisc.univ-evry.fr Samir.otmane_(at)_ibisc.univ-evry.fr]) by 15 February 2011.&lt;br /&gt;
&lt;br /&gt;
Laboratoire IBISC, 40 rue de Pelvoux, 91020 Evry, FRANCE&lt;br /&gt;
&lt;br /&gt;
Phone: +33 1 69 47 75 92&lt;/div&gt;</summary>
		<author><name>Gi</name></author>	</entry>

	</feed>