AVR Team - Automatique Vision et Robotique (Control, Vision and Robotics)

Internship topics


Developing an Augmented Reality tool to visualize X-ray radiations

Internship offer

Located on the campus of Strasbourg's University Hospital, the CAMMA research group develops new tools and methods based on machine learning and computer vision to support the medical staff working in the operating room.

AR exposure of a surgeon

Mission: Within an innovative team, contribute to the development and optimization of an application for visualizing simulated X-rays in augmented reality. The purpose of this application is to raise awareness of radiation exposure in the operating room, for in-situ safety teaching.

Based on the Microsoft HoloLens, the application places you in the context of an X-ray-guided intervention, with the radiation simulated on the GPU using a Monte Carlo method. Tests will be performed in a hybrid room equipped with RGBD cameras, which will need to be registered with the HoloLens. The HoloLens will communicate with the rest of the system over Wi-Fi.
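The registration between the RGBD cameras and the HoloLens mentioned above is essentially a rigid-transform estimation problem. As a purely illustrative sketch (not the project's actual code), assuming corresponding 3D points, e.g. corners of a calibration target, have already been located in both frames, the transform can be estimated with the Kabsch algorithm:

  import numpy as np

  def register_rgbd_to_hololens(pts_rgbd, pts_holo):
      """Estimate R, t such that R @ p_rgbd + t ≈ p_holo (Kabsch algorithm).
      pts_rgbd, pts_holo: (N, 3) arrays of corresponding 3D points."""
      c_a, c_b = pts_rgbd.mean(axis=0), pts_holo.mean(axis=0)      # centroids
      H = (pts_rgbd - c_a).T @ (pts_holo - c_b)                    # cross-covariance
      U, _, Vt = np.linalg.svd(H)
      D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
      R = Vt.T @ D @ U.T
      t = c_b - R @ c_a
      return R, t

Once R and t are known, any point measured by an RGBD camera can be expressed in the HoloLens frame as R @ p + t and overlaid on the simulated dose visualization.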

Required profile: Currently a student in the last year of an engineering school or of a research Master's program specialized in computer science, you are looking for an end-of-studies internship:

  • you are serious and motivated
  • you have skills in computer vision
  • you are strongly attracted by augmented reality
  • you are able to work in a team
  • you have good English skills, both written and spoken

Appreciated:

  • Experience with C++, C# and/or Unity
  • Experience in the development of an AR application, ideally on HoloLens

Duration: 5 to 6 months

Starting date: January-February 2020

Job types: Full-time, Internship

Website: http://camma.u-strasbg.fr/

Contact information:

cindy.rolland@unistra.fr

alexandre.krebs@unistra.fr

References:

N. Loy Rodas, J. Bert, D. Visvikis, M. de Mathelin, N. Padoy, Pose Optimization of a C-arm Imaging Device to Reduce Intraoperative Radiation Exposure of Staff and Patient during Interventional Procedures, IEEE International Conference on Robotics and Automation (ICRA), 2017

N. Loy Rodas, F. Barrera, N. Padoy, See It With Your Own Eyes: Marker-less Mobile Augmented Reality for Radiation Awareness in the Hybrid Room, IEEE Transactions on Biomedical Engineering (TBME), vol. 64, no. 2, pp. 429-440, Feb. 2017, doi:10.1109/TBME.2016.2560761

N. Loy Rodas, N. Padoy, Seeing Is Believing: Increasing Intraoperative Awareness to Scattered Radiation in Interventional Procedures by Combining Augmented Reality, Monte Carlo Simulations and Wireless Dosimeters, International Journal of Computer Assisted Radiology and Surgery (IJCARS), MICCAI Special Issue, vol. 10, no. 8, pp. 1181-1191, 2015

Computer vision for robotic flexible endoscopy

PDF file of the internship proposal

Title: Environment reconstruction using a monocular endoscopic camera

Keywords: visual tracking, shape from motion, depth recovery, medical robotics

Duration: approximately 5 months (ideally between February and August 2021)

Grant: legal grant for training periods (~550 euros / month).

Location: ICube robotics platform, at IHU Strasbourg

Context: This internship is part of our work on assisting medical procedures with robotic flexible endoscopes.

The AVR team of the ICube laboratory has developed a robotic platform for endoluminal surgery called STRAS (see photo below). It is a telemanipulated system equipped with an endoscopic camera and two articulated instruments, each with 3 degrees of freedom. In addition to conventional telemanipulation, we want to add automatic modes to the robot, in order to perform tasks such as automated scanning or automatic endoscope positioning. One of the difficulties to be tackled for this purpose is reconstructing the shape of the environment with the only available sensor: a monocular endoscopic camera.

Automatic task viewed from the endoscopic camera
STRAS robotic system

Problem to be solved: In this project, we aim to reconstruct the shape and position of the environment (tissues in an in vivo environment, phantoms in laboratory setups) with respect to the endoscopic camera. Since the camera is monocular, shape and structure from motion will primarily be used to reconstruct the environment and motions up to a scale factor; shape from shading could also be envisioned. The difficulties are the low quality of endoscopic images, the limited lateral displacement of the endoscope, and the possible interactions of the instruments with the tissues, which create disturbing motions and deformations. In a second step, we will try to reconstruct the metric shape and positions. This can be done using odometric measurements on the endoscope; however, these measurements are known to be imprecise. Specific strategies will thus be needed to recover the unknown scale factor, for instance Bayesian filtering approaches or machine learning techniques.
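To make the scale-recovery step concrete, here is a minimal sketch under assumed inputs (it is not part of the existing code base): the unknown scale of the monocular reconstruction is estimated by comparing the per-frame translation magnitudes from structure from motion with the metric displacements reported by the endoscope odometry, either in batch or recursively as new frames arrive.

  import numpy as np

  def estimate_scale(sfm_translations, odo_displacements):
      """Batch least-squares scale s minimizing sum_i (s*||t_i|| - d_i)^2.
      sfm_translations: (N, 3) relative camera translations (arbitrary scale).
      odo_displacements: (N,) metric displacements from the robot encoders."""
      n = np.linalg.norm(sfm_translations, axis=1)
      return float(n @ odo_displacements / (n @ n))

  def update_scale(scale, weight, t_sfm, d_odo):
      """Recursive version of the same estimate, usable online frame by frame."""
      n = np.linalg.norm(t_sfm)
      new_weight = weight + n * n
      new_scale = (scale * weight + n * d_odo) / new_weight
      return new_scale, new_weight

A Bayesian filter (for instance a Kalman filter on the scale and the encoder bias) would replace this simple weighted average once a noise model for the odometry is available.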

Work to be carried out: The intern will develop algorithms for shape reconstruction from monocular images, relying on state-of-the-art methods for tissue tracking in endoscopy (gastroenterology in particular). Algorithms have already been implemented for pure tracking and can serve as a basis. Techniques for depth estimation will then be developed, focusing on the use of embedded measurements provided by the robot encoders. If needed, a second miniature camera could be added to the setup. Tests will be carried out in the laboratory on phantoms and on in vivo images acquired during previous preclinical trials.

Work environment: The internship will take place on the medical robotics platform of the ICube laboratory, located at the IHU (Institut Hospitalo-Universitaire) in the heart of Strasbourg. The intern will be supervised by Florent Nageotte (associate professor in medical robotics) and Philippe Zanne (engineer, responsible for the STRAS robotic system). The intern will have access to a computer for developing programs, to image acquisition systems, to in vivo images and to the robotic device for laboratory testing. Developments will be made in C/C++ or Python, possibly with Matlab for prototyping.

Covid-19 conditions: If health restrictions prevent the internship from being carried out on site, a large part of the work can be done remotely, working on data acquired offline; only the robotic testing would become impossible. The intern will have to work on his/her own laptop, either developing and running algorithms locally or remotely on a connected machine.

Candidate profile: We are looking for second-year Master's students, or engineering-school students at the Master 2 level, majoring in computer vision, or in robotics / computer science with a strong interest in or experience with computer vision. Interest in medical applications is a plus. Proficiency in C/C++ or Python is mandatory.

Conditions: 5 to 6 months between February 2021 and August / September 2021. The intern will receive the legal "gratification" (around €550 / month).

Application: Interested candidates should send a CV / resume, their Master's program and grades (if available), and a motivation letter to Nageotte@unistra.fr, mentioning "computer vision internship" in the email subject.


Topics in Computer Vision / Deep Learning (CAMMA: Computational Analysis and Modeling of Medical Activities)

We are looking for motivated and talented students with knowledge in computer vision and/or machine learning who can contribute to the development of our computer vision system for the operating room.

Please feel free to contact Nicolas Padoy if you are interested in doing your master's thesis or an internship with us (funding of ~€500/month will be provided for 4 to 6 months). The successful candidates will be part of a dynamic and international research group hosted within the IRCAD institute at the University Hospital of Strasbourg. They will thereby have direct contact with clinicians and industrial partners, and access to an exceptional research environment. The CAMMA project is supported by the CAMI laboratory of excellence, the IdEx Unistra and the MixSurg Institute.

Topics:

  • Deep Learning for Activity Recognition in Large Video Databases
  • Multi-view Human Body Tracking for the Operating Room using RGBD Cameras

More information about CAMMA


Experimental modeling and control of a breathing device

Published on 5 May 2021

Context

Breathing is known to have an impact on stress and well-being. Synchronized breathing techniques, also known as cardiac coherence, consist in adjusting one's breathing to a reference signal in order to reduce stress and improve well-being.

Internship objectives

This internship aims to develop synchronized-breathing scenarios adapted to the users. After a state-of-the-art review of the subject, the work may follow these steps:

  • creation of test scenarios
  • experiments on volunteers, using a virtual-reality immersion device and a sensor providing heart and breathing rates
  • construction of a model from the experimental data (a minimal sketch follows this list)
  • synthesis of a control law based on the model
  • evaluation of the control law and comparison with other techniques
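As a purely illustrative sketch of the model-construction step (the signal names are assumptions, not project specifications), a first-order discrete-time model relating the breathing-guide pace to a measured response such as heart rate can be identified by least squares from the recorded experiments:

  import numpy as np

  def fit_first_order(u, y):
      """Fit y[k+1] = a*y[k] + b*u[k] by ordinary least squares.
      u: guide pace (e.g. breaths/min), y: measured response (e.g. heart rate)."""
      Phi = np.column_stack([y[:-1], u[:-1]])          # regressor matrix
      (a, b), *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
      return a, b

  def simulate(a, b, u, y0=0.0):
      """Simulate the identified model on a new guide signal, e.g. to test a control law."""
      y = [y0]
      for uk in u[:-1]:
          y.append(a * y[-1] + b * uk)
      return np.array(y)

The identified model can then serve as the basis for synthesizing and evaluating the control law mentioned in the last two steps.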

Internship duration: 3 to 5 months

Work environment

The internship will take place on the Strasbourg-South campus (in Illkirch), very well served by tram (20 minutes from the city center), in the Automatique-Vision-Robotique team of the ICube laboratory. You will interact with the various partners of the project, in particular the company HypnoVR, which leads it.

Profile and application

A student in a scientific Master's program or an engineering school, you have both theoretical and experimental skills in the control and identification of dynamical systems. You are motivated to get involved in a collaborative research project between a research laboratory and a start-up; your communication skills and your sense of commitment will be key success factors.

Your application, including a CV and a cover letter, should be sent to laroche@unistra.fr.


Modeling the movements of the endoscope and the endoscopist in interventional endoscopy

Context

This internship is part of a project between the ICube laboratory and the IHU Strasbourg on modeling the endoscopic gesture in interventional endoscopy, with the goal of developing a realistic interactive simulation tool, based on experimental measurements, for training practitioners.

Internship objectives

The mission consists in processing and analyzing spatial and temporal data (shapes, positions and displacements) from gastric endoscopy. During diagnostic gastroscopy, the position of the endoscopist and the shape of the endoscope are recorded with dedicated sensors. Several preprocessing and analysis steps have already been carried out; the remaining work is to identify correlations or patterns with clinical meaning in the data and to prepare them for a medical publication.
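As an illustration of the kind of processing involved (the feature names and libraries are assumptions, not imposed choices), simple trajectory features can be extracted from each recorded exam and then related to the practitioner's experience or clustered into gesture families:

  import numpy as np
  from scipy.stats import spearmanr        # rank correlation, robust to nonlinearity
  from sklearn.cluster import KMeans

  def exam_features(positions, dt):
      """positions: (T, 3) endoscope tip positions for one exam; dt: sampling period."""
      steps = np.linalg.norm(np.diff(positions, axis=0), axis=1)
      return np.array([steps.sum() / (len(steps) * dt),   # mean tip speed
                       steps.sum(),                        # total path length
                       len(positions) * dt])               # exam duration

  def analyse(X, experience, n_clusters=3):
      """X: (n_exams, 3) feature matrix; experience: years of practice per exam."""
      labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X)
      rho, p = spearmanr(X[:, 0], experience)   # does mean speed track experience?
      return labels, rho, p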

Internship duration: 3 to 6 months

Work environment

The internship will take place on the Hôpital Civil campus, within the IHU Strasbourg, near the Petite France district in the center of Strasbourg, in the Automatique-Vision-Robotique team of the ICube laboratory.

The work will be carried out in collaboration with a hospital practitioner specialized in digestive endoscopy and a computer scientist.

Profile and application

Currently in the fourth or fifth year of an engineering school or of a Master's program, you have a solid background in data science, which will allow you to propose and implement methods for processing the experimental data, in order to classify and characterize the endoscopic gestures according to the practitioner's experience and the type of endoscopic examination performed.

Knowledge and practice of Matlab, Python or an equivalent language is required. Knowledge of statistical and data-analysis methods is desirable.


Your application, including a CV and a cover letter, should be sent to demathelin@unistra.fr.