Development and validation of stimulation strategies for the optogenetics

Author: Quentin Sabatier; Ryad Benosman; Stéphane Régnier; Yves Frégnac; Laurent Perrinet
Published: 2018.
Dissertation: Thèse de doctorat : Robotique : Sorbonne université : 2018.
Edition/Format:   Computer file : Document : Thesis/dissertation : English

Subjects: Optogénétique; Modélisation; Neurostimulation; Caméra neuromorphique; Vision artificielle (robotique); Rétinite pigmentaire--Aspect génétique; Programmation événementielle; Prothèses rétiniennes; Photorécepteurs (cellules) de la rétine; Interface cerveau-machine

Find a copy online

Links to this item:
http://www.theses.fr/2018SORUS083/document (full-text access)
https://tel.archives-ouvertes.fr/tel-02498147
http://www.theses.fr/2018SORUS083/abes

Details

Genre/Form: Thèses et écrits académiques
Material Type: Document, Thesis/dissertation, Internet resource
Document Type: Internet Resource, Computer File
All Authors / Contributors: Quentin Sabatier; Ryad Benosman; Stéphane Régnier; Yves Frégnac; Laurent Perrinet; Serge Picaud; Sio-Hoï Ieng; Sorbonne université (Paris / 2018-....); École doctorale Sciences mécaniques, acoustique, électronique et robotique de Paris; Institut de la vision (Paris).
OCLC Number: 1148969631
Notes: Title taken from the title screen.
Description: 1 online resource
Responsibility: Quentin Sabatier; supervised by Ryad Benosman.

Abstract:

One and a half million people suffer from Retinitis Pigmentosa, a family of inherited diseases that cause degeneration of the retina. The disease begins with the loss of night vision and of the peripheral visual field, and progresses to total blindness. Because the genetic mutations responsible for the disease are heterogeneous, emerging solutions aim to compensate for its symptoms rather than cure it. These retinal prostheses comprise three elements: (i) a camera filming the scene in front of the patient, usually mounted on a pair of glasses; (ii) a stimulation device that controls part of the patient's neuronal activity; and (iii) a processor that implements the transformation between the camera's output signal and the stimulation commands. The work presented in this thesis contributes to GenSight Biologics' effort to develop such a retinal prosthesis. The project combines two recent technologies: a neuromorphic camera, in which each pixel acquires the signal asynchronously and with very high temporal resolution, and optogenetics, which makes the targeted neurons photoexcitable. My work spans the entire signal-processing chain. We first present an algorithm that extracts the spatial frequencies of the video from the asynchronous measurement stream emitted by the camera. We then focus on the brain-machine interface, developing a model of the transformation linking the light signal projected by the glasses to the trains of action potentials triggered by the patient's retinal ganglion cells.
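To make the two processing stages named in the abstract concrete, the following is a minimal illustrative sketch, not the thesis's actual algorithms: it first accumulates asynchronous event-camera measurements (timestamp, pixel coordinates, polarity) into a leaky per-pixel activity map, then drives a standard linear-nonlinear-Poisson (LNP) model of the kind commonly used to describe retinal ganglion cell spiking. All function names, parameters, and constants here are assumptions chosen for illustration.

```python
import numpy as np

def events_to_frame(events, shape=(4, 4), tau=0.05):
    """Accumulate asynchronous (t, x, y, polarity) events into a leaky
    per-pixel activity map. Each event decays the whole map toward zero
    with time constant tau, then adds +1 (ON event) or -1 (OFF event)
    at its pixel. A common first step for event-based camera output."""
    frame = np.zeros(shape)
    last_t = 0.0
    for t, x, y, p in events:
        frame *= np.exp(-(t - last_t) / tau)  # exponential decay since last event
        frame[y, x] += 1.0 if p else -1.0     # ON adds, OFF subtracts
        last_t = t
    return frame

def lnp_spike_counts(stimulus, weights, dt=0.001, gain=20.0, rng=None):
    """Linear-nonlinear-Poisson sketch of a ganglion cell: project the
    stimulus through a linear filter, rectify, and draw Poisson spike
    counts per time bin. The real model in the thesis may differ."""
    if rng is None:
        rng = np.random.default_rng(0)
    drive = np.maximum(stimulus @ weights, 0.0)  # linear stage + rectification
    return rng.poisson(gain * drive * dt)
```

For example, two ON events 10 ms apart at the same pixel leave that pixel with activity 1 + exp(-0.2) ≈ 1.82, while an all-zero stimulus drives zero spikes.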

