
GB2527514A - Ocular simulation tool - Google Patents

Ocular simulation tool

Info

Publication number
GB2527514A
GB2527514A GB1411134.8A GB201411134A GB2527514A GB 2527514 A GB2527514 A GB 2527514A GB 201411134 A GB201411134 A GB 201411134A GB 2527514 A GB2527514 A GB 2527514A
Authority
GB
United Kingdom
Prior art keywords
user
movement
eye
ocular condition
ocular
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1411134.8A
Other versions
GB201411134D0 (en)
Inventor
Luke Anderson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Swansea Medical Apps Ltd
Original Assignee
Swansea Medical Apps Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Swansea Medical Apps Ltd filed Critical Swansea Medical Apps Ltd
Priority to GB1411134.8A priority Critical patent/GB2527514A/en
Publication of GB201411134D0 publication Critical patent/GB201411134D0/en
Priority to PCT/GB2015/051811 priority patent/WO2015198023A1/en
Publication of GB2527514A publication Critical patent/GB2527514A/en
Withdrawn legal-status Critical Current

Links

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Medicinal Chemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Algebra (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Medical Informatics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention provides a medical training system comprising software arranged to reconfigure a virtual face 1 or portion thereof in response to a user's movement in free space and in accordance with one or more symptoms relating to a predetermined ocular condition. The system further comprises motion sensing means (2, figure 9) arranged to detect the user's movement made in free space. Therefore, the user's gestures can be used to input commands via a hands-free input interface. The invention simulates one or more ocular conditions which are displayed on a screen. The user, e.g. a medical student, is able to interact with the system via a variety of input means, including movement in free space, so as to produce a response on the virtual patient's face, the response being symptomatic of one or more ocular conditions. For example, the eyes may track the user's finger in a particular manner. The invention also provides a simulation tool for training users in respect of a corrective procedure, such as laser surgery.

Description

Ocular Simulation Tool
This invention relates generally to medical tools, and more particularly to computer simulation techniques for assisting in the training of medical students and practitioners.
The invention is particularly suited for use in teaching how to diagnose and/or treat eye-related conditions such as ocular motility problems, pupil abnormalities, lid abnormalities and neurological problems concerning the eye(s) and face.
Ophthalmologists and neuro-ophthalmologists are specialist physicians who are trained to diagnose and treat patients with a variety of conditions which affect and/or arise from the eye. These could include ocular motility problems, pupil abnormalities, lid abnormalities and neurological problems. The symptoms of such conditions may be manifested in the patient's eye(s) and/or face, and the physician must be able to recognise the symptom(s), diagnose the cause, and recommend a corrective solution. The solution may take a variety of forms, including medication, physiotherapy or surgery. A variety of tests can be used in clinical practice to assess the eye condition.
In practice, training usually consists of clinical exposure to patients who have eye conditions. This may be undertaken as part of a medical residency, where the student shadows a trained and qualified practitioner. The student observes and practises diagnosis of conditions, and the tests that can be used in clinical practice. The more patients the student is exposed to, and the greater the variety and spectrum of ocular conditions encountered, the faster the student will learn and the deeper the student's understanding will be.
However, such clinical exposure can only be achieved over a long period of time, and during surgery hours when qualified doctors and patients are available. Therefore, it would be beneficial to provide an alternative approach to training which is not reliant upon the presence of a trained practitioner.
Several computer-based training arrangements are known in the art. One such known arrangement for teaching ocular conditions is the 'LUMA Vision Simulator' from Eyemaginations, available from Apple Inc.'s® App Store. This provides a tool to allow doctors to explain eight common diseases of the eye. Disease progressions are showcased in up to eight anatomical views and four corresponding POV scenes. The interface allows users to view disease progressions and point-of-view scenes side-by-side or one-at-a-time.
Also, an on-screen drawing tool gives doctors the ability to annotate and mark up images.
However, this does not simulate the patient's face as would be seen by a qualified medical practitioner during clinical practice, and therefore does not offer the student the opportunity to practice and learn in a life-like manner. It does not offer the student the chance to simulate the performance of tests and assessments which would be encountered in clinical practice.
The LUMA Optical system, also provided by Eyemaginations, provides a patient education tool for use in reception areas and waiting rooms of eye care specialists' offices. However, this is a passive system, which merely presents information to the patient without interaction. It does not enable the viewer to control or influence the performance of a simulated ocular assessment or treatment.
The EMA Ophthalmology software system, developed by Modernizing Medicine, provides an Electronic Medical Records (EMR) system for use in ophthalmology practice. It enables the practitioner to take notes and maintain patient records. However, it does not assist in medical training, or perform any simulation of conditions or clinical tests.
The retinoscopy simulator from eyedocs.co.uk provides a computer-based simulator which enables a user to replicate a retinoscopy on a virtual eye displayed on the screen.
Retinoscopy is a technique used by practitioners to obtain an objective measurement of the refractive error of a patient's eyes. The examiner uses a retinoscope to shine light into the patient's eye and observes the reflection (reflex) off the patient's retina. While moving the streak or spot of light across the pupil, the examiner observes the relative movement of the reflex, then uses a phoropter or manually places lenses over the eye (using a trial frame and trial lenses) to 'neutralize' the reflex. The 'retinoscopy simulator' displays a virtual streak or beam of light over the eye, which the user can then move using a mouse so as to observe the simulated patient's response. However, the position and movement of the 'light' is constrained by the input device used to control its movement, and 3-dimensional movement is not possible. The distance of the light source from the eye is 'hard-wired' into the simulator and cannot be altered during use by a user. In clinical practice, however, the practitioner would be able to move his hand and the light source in any direction he chooses relative to the patient's eye: up, down, towards or away relative to the eye.
Therefore, the known simulation approach cannot provide a truly realistic experience for a trainee.
Thus, it is desirable to provide a tool which can assist in the training of medical practitioners when learning how to recognise and/or treat a variety of eye-related problems.
Ideally, such a solution should be relatively cheap to supply, easy and intuitive to use, and accessible at any time. Such a solution may provide the student with exposure to a spectrum of ocular conditions, to enhance the student's clinical learning experience.
Ideally, the solution would provide a simulation-based training experience which allows a more realistic interaction with the user than previously provided by the prior art.
Such an improved solution has now been devised.
Thus, in accordance with the present invention there is provided a computer-implemented solution as defined in the appended claims.
Therefore, in accordance with an aspect of the invention there is provided a medical training system comprising: software arranged to reconfigure a virtual face or portion thereof in response to a user's movement in free space and in accordance with one or more symptoms relating to a predetermined ocular condition.
The phrase 'movement in free space' may be interpreted as meaning 'gestures'. The movement may be motion in any direction. It may be physical motion made in air. It may be distinguished, therefore, from movement which is made via contact with an object, e.g. in contrast to a finger moving across or on a touch screen. The free movement may be made without the aid of a pointing or computer input device such as a mouse, trackball, haptic device etc. The present invention enables the user to provide input to control the simulator's behaviour by moving his body or a part thereof in 3-dimensional space, without the use of a pointing or tracking device such as a mouse. The user's detected movement may be in any direction or angle relative to a sensor device and/or a device on which the invention is installed for execution.
Beneficially, the system further comprises motion sensing means arranged to detect the user's movement in free space. The motion sensing means may be a motion sensing input device or apparatus. It may detect motion visually. It may comprise one or more 3-D depth sensors. The motion sensing means may be a sensing system comprising a camera and suitably arranged software. The sensing means may measure the user's movement relative to a reference position such as the position of the sensing means. Data relating to the user's movement may be captured. Such data may relate to the speed, direction and plane of movement. The movement-related data may then be used to change what is shown on a screen. It may be used to reconfigure the virtual face or portion thereof (e.g. by causing the eye(s) to move in response to the user's movement) or may be used to reposition a virtual tool or piece of equipment, such as a scalpel, aiming beam or lens.
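By way of illustration only, the following Python sketch shows one way such movement data could be mapped onto the display. The patent does not specify an implementation language or sensor API; the co-ordinate convention and function names here are assumptions.

```python
import math

def fingertip_to_gaze(finger_xyz, eye_xyz):
    """Convert a tracked fingertip position (metres, in the sensor's
    co-ordinate frame) into horizontal and vertical gaze angles
    (degrees) for one virtual eye."""
    dx = finger_xyz[0] - eye_xyz[0]   # left/right offset
    dy = finger_xyz[1] - eye_xyz[1]   # up/down offset
    dz = finger_xyz[2] - eye_xyz[2]   # distance out from the face
    horizontal = math.degrees(math.atan2(dx, dz))
    vertical = math.degrees(math.atan2(dy, dz))
    return horizontal, vertical

# e.g. a fingertip 0.2 m to the side at 0.5 m from the virtual eye:
print(fingertip_to_gaze((0.2, 0.0, 0.5), (0.0, 0.0, 0.0)))
# -> roughly (21.8, 0.0) degrees
```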
Thus, the motion sensing apparatus may enable the user to interact with the invention through a natural user interface using gestures and/or spoken commands. The system may also comprise voice recognition technology to facilitate the user's input via spoken commands. Thus, the invention may provide a hands-free interface for the input of user commands.
Beneficially, the system may further comprise a display device for displaying the virtual face or portion thereof. The display device may be a screen. It may be a touch screen.
The display device may be provided as part of a computer-based device such as a tablet computer, smart phone, laptop, PC etc. The user may be able to provide input to the software via one or more other input devices in addition to the motion sensing means. For example, the user may be able to provide input via a touch screen, and/or by clicking on something shown on the screen. Voice recognition may be used to receive the user's input.
The predetermined ocular condition may be selected from a plurality of predetermined ocular conditions. Thus, the system may be configured to simulate more than one ocular condition. The user may select which ocular condition is to be simulated. Additionally or alternatively, the software may select the ocular condition for simulation.
The training system may be described as a medical simulation tool. The system may be interactive in that it responds to input received from the user. The input may be the user's movement in free space. Thus, the user may control his interaction with the system by movement in free space. The training system may recreate (simulate) the clinical tests which medical practitioners use to assess, treat and/or diagnose ocular conditions in real practice. Such medical practitioners may be orthoptists, opticians, neuro-ophthalmologists, optometrists or any practitioner working in a clinical practice relating to eye conditions.
Assessment may include diagnosis of the condition, and/or may include an evaluation of the severity of the condition.
The user's movement in free space may serve as input into the system. The input may then be used to generate a response from the system. For example, the user may move his finger around in front of the system, which then detects this motion and alters the virtual face on the screen in response, e.g. by moving the eyes, in a manner consistent with the ocular condition. This response to the user's input may aid in diagnosing or assessing a condition; for example, the virtual patient may be unable to visually track the user's finger correctly, this being a symptom indicative of a particular eye disease or neurological problem.
The software may be arranged to control the response of the virtual face according to the predetermined ocular condition. Thus, the movement or presentation of the virtual face may be displayed such that it exhibits and mimics symptoms expected from a real patient having the predetermined condition.
The training system may simulate at least a portion of the face of a real (i.e. live) patient who has at least one ocular-related condition. The virtual face may be a graphical representation of a human face. It may include (virtual) eyes, nose and/or mouth. The software may be configured to present the virtual face as part of a human head. The software may be configured to present the virtual face and/or head from a variety of perspectives or angles. The image of the head may be rotatable so that it can be viewed from a variety of angles.
The virtual face (or portion thereof) may be a graphical representation or an image. The representation may depict a view of a human face. It may graphically replicate what a trained practitioner would see before him in practice. It may be an external view of a human face. It may be a 2-D or 3-D graphical representation. Thus, the tool may simulate the experience of a medical practitioner who is looking at a live patient being treated or assessed in clinical practice.
This is in contrast to some prior art arrangements which display only a representation of the internal, anatomical structure of the eye and/or face. Thus, the present invention more closely simulates the experience of assessing and/or treating a real patient in practice.
The user may be a student, e.g. a medical student, or other medical practitioner who wishes to learn about the treatment and/or diagnosis of ocular-related conditions.
The software may be configured to execute on any kind of computing device, including a tablet computer, laptop, desktop computer, smartphone etc. The computing device may comprise a screen. The screen may be a touch screen. The virtual face or portion thereof may be presented on a screen associated with a computing device upon which the software is installed for execution.
Herein, the term 'ocular condition' may be used to refer to any condition which pertains to or affects the eye in any way. For example, it could be an ocular motility problem, or a pupil abnormality, or a lid abnormality, or a neurological problem concerning the eye and face. It could be any condition which would fall within the medical field of ophthalmology, neuro-ophthalmology, or optometry, for example. The condition may be caused by an individual's ocular anatomy or physiology, or an eye disease.
The predetermined ocular condition may be selected from a plurality of pre-determined ocular conditions. The ocular conditions may be stored in computer memory. Thus, during a training exercise a particular ocular condition may be selected for presentation to the user using the virtual face.
The virtual face or portion thereof may be reconfigured during use such that the virtual face exhibits one or more symptoms known to relate to the predetermined ocular condition.
The user's task may be to diagnose (identify) and/or assess the condition by observing the symptom(s) displayed.
The behaviour symptomatic of the ocular condition may be demonstrated in a variety of ways. It may be manifested in a graphical representation: for example, an eye (of the virtual face) may be depicted as bloodshot, or an eyelid may be shown as drooping.
Additionally or alternatively, the symptom may be shown in an animated form: for example, a virtual eye may not track a moving finger as expected in a healthy individual, or a virtual pupil may not respond as expected in a healthy eye. The symptom may be associated with one or both of the eyes of the virtual face, and/or another part of the virtual face, e.g. the mouth or lips or cheeks.
The virtual face may be able to respond to input received from the user. In this sense, the system may be described as being interactive. The user's input may enable the user to manipulate the virtual face. This input may be provided by movement in free space which is detected by the motion sensing means. Optionally, input may also be received by another input means, e.g. via a keyboard. For example, pressing a key may cause the (virtual) eyes to blink, or may cause the virtual patient to look to the left, right, up or down.
Additionally or alternatively, the user's input may be received via a touch screen. For example, swiping or dragging the user's finger across the screen may cause the virtual eyes to move, thus simulating visual tracking of a physician's finger in real life.
The user may be able to input a diagnosis of the at least one predetermined ocular condition. The user's input diagnosis may be received in a variety of ways. For example, the user may speak his diagnosis into a microphone, or may type it via a keyboard, or may select it on the screen using a pointing device such as a mouse.
The software may be arranged to determine whether the user has correctly assessed the predetermined ocular condition by comparing the user's inputted assessment with the pre-determined ocular condition.
One or more rules may be stored in association with the pre-determined ocular condition. These may be stored in volatile and/or non-volatile memory associated with the device upon which the software is arranged to execute.
The one or more rules may specify how the virtual face is to be displayed and/or manipulated in accordance with the pre-determined ocular condition. The rules may specify how the virtual face is to respond to the user's input in accordance with the predetermined condition. Thus, the rules may specify how the software is to reconfigure the virtual face in response to the user's motion, in accordance with behaviour symptomatic of the predetermined condition and the user's movement.
Thus, the invention may comprise one or more rule sets, each rule set specifying behaviour associated with a particular ocular condition and how the face or eyes are to be reconfigured in response to the user's input.
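As a minimal sketch of how such rule sets might be represented (the condition names are drawn from the list given later in this description; the parameter names and figures are assumptions, not taken from the patent):

```python
# Hypothetical rule-set table: each entry limits or modifies how the
# virtual eyes respond to tracked input for one simulated condition.
RULE_SETS = {
    "normal":     {"gaze_limit_deg": 30, "tracking_gain": 1.0},
    "esotropic":  {"gaze_limit_deg": 30, "tracking_gain": 1.0,
                   "resting_offset_deg": -15},  # eye turned inwards
    "exotropic":  {"gaze_limit_deg": 30, "tracking_gain": 1.0,
                   "resting_offset_deg": +15},  # eye turned outwards
    "neurogenic": {"gaze_limit_deg": 10, "tracking_gain": 0.5},
}

def apply_rules(condition, target_angle_deg):
    """Offset and clamp the requested gaze angle per the active rule set."""
    rules = RULE_SETS[condition]
    angle = target_angle_deg * rules["tracking_gain"]
    angle += rules.get("resting_offset_deg", 0)
    limit = rules["gaze_limit_deg"]
    return max(-limit, min(limit, angle))
```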
The system may be arranged to show or hide the name or other content identifying the ocular condition upon instruction from the user. The system may be arranged so that the identity (e.g. name) of the ocular condition is communicated to the user. For example, it may be displayed on the screen for the user to see. It may also be arranged so that the identity of the ocular condition is not communicated to the user, so that the user has to determine which ocular condition is being demonstrated via the reconfiguration and/or behaviour of the virtual face. The system may provide a mechanism, e.g. a button or icon, via which the user can choose to have the condition communicated or not communicated. E.g. the name of the condition may be displayed or hidden by clicking on a button on a screen, or selecting an option from a menu. This allows the user to interact with the system in more than one mode. The user may choose to know the name of the condition and then observe the relevant symptoms with this knowledge in mind, or may choose to observe the symptoms and attempt to diagnose the condition. The identity of the condition may then be revealed after the user has inputted a diagnosis.
The software may be configured to move the eyes in a smooth motion or saccade movement. The user's input may be used to control whether smooth or saccade eye movement is shown.
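A sketch of how the two movement modes might be animated, assuming the 400°/sec saccade velocity quoted in the rule listings later in this description; the frame-stepping structure and names are assumptions of this sketch:

```python
def step_eye(current_deg, target_deg, mode, dt):
    """Advance a virtual eye one animation frame towards the target
    angle. A tap triggers a ballistic saccade at 400 deg/s (the figure
    used in the rule listings below); a dragged target is followed
    directly as smooth pursuit."""
    if mode == "smooth_pursuit":
        return target_deg              # eye stays locked to the target
    saccade_speed = 400.0              # degrees per second
    step = saccade_speed * dt
    if abs(target_deg - current_deg) <= step:
        return target_deg              # saccade completes this frame
    direction = 1.0 if target_deg > current_deg else -1.0
    return current_deg + direction * step
```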
The software may be arranged to reconfigure the virtual face or portion thereof in response to a simulated moving beam of light or item of medical equipment. For example, the user's motion may be used to determine the movement of a beam of light being shone into the virtual patient's eye. The system may simulate the movement of a lens in front of a virtual eye. The response of the virtual eye may be controlled by the user's (detected) motion.
The movement of the simulated beam of light or medical equipment may be influenced by the user's movement in free space.
The system may be arranged to enable the user to simulate performance of a corrective procedure. The corrective procedure may be laser surgery. It may be retinoscopy.
The system may be arranged to provide a medical training tool. The system may comprise teaching modules or components arranged to guide a student through a learning experience. The system may comprise stored content for use in assessing the student's ability to diagnose, assess and/or treat one or more ocular conditions. The system may comprise metrics against which the student's (user's) performance may be assessed. Data relating to the student's performance may be stored for future reference and/or sent to a destination via a network.
In accordance with another aspect of the invention there is provided a medical training system arranged to simulate facial and/or ocular behaviour associated with a variety of ocular conditions, the system comprising: a plurality of rule sets, each rule set specifying behaviour associated with a respective ocular condition; and software arranged to reconfigure a computer-generated representation of a face or portion thereof in accordance with at least one of the rule sets and in response to a user's input.
The user's input may be received via a motion sensing arrangement.
Features described above in relation to other aspects of the invention may also apply to this aspect of the invention.
Also in accordance with the invention, there is provided a method corresponding to the above-described system. Thus, there is provided a medical training method comprising the step: using software to reconfigure a virtual face or portion thereof in response to a user's movement in free space, and in accordance with one or more symptoms relating to a predetermined ocular condition.
The term 'virtual face' may be interpreted as meaning a simulation or image which represents a head or face.
The features described above may also be applicable in respect of the method.
The method may further comprise any or all of the following steps:
* providing a display device for displaying the virtual face or portion thereof;
* enabling a user to input a diagnosis of the predetermined ocular condition;
* providing software arranged to determine whether the user has correctly diagnosed the ocular condition by comparing the user's inputted diagnosis with a stored record of the predetermined ocular condition;
* storing one or more rules in association with the predetermined ocular condition;
* wherein the one or more rules specify how the virtual face or portion thereof is to be reconfigured in accordance with the ocular condition;
* providing software arranged to display a variety of virtual faces or portions to the user;
* enabling the user to perform a simulation of a clinical test for assessing and/or diagnosing the ocular condition;
* showing or hiding the name or other content identifying the ocular condition upon instruction from the user;
* arranging the software to reconfigure the virtual face or portion thereof in response to a simulated moving beam of light or item of medical equipment;
* wherein the movement of the simulated beam of light or medical equipment is influenced by the user's movement in free space;
* enabling the user to simulate performance of a corrective procedure;
* wherein the corrective procedure is laser surgery; and/or
* arranging the software to provide a medical training tool.
Also in accordance with the invention there may be provided a medical training system arranged to enable a user to simulate the performance of a procedure performed in relation to an ocular condition, the system comprising: software arranged to simulate the movement of an item of equipment controlled by a user, the movement corresponding to the user's movement in free space.
The system may further comprise a motion sensing means arranged to detect the user's gestures (movement made in free space).
Thus, it may enable the user to interact with the invention through a natural user interface using gestures and/or spoken commands. The system may comprise voice recognition technology to facilitate the user's input via spoken commands. Thus, the invention provides a hands-free interface for the input of user commands.
The software may be arranged to display a variety of virtual faces to the user. Thus, the tool may be configured to simulate a variety of 'real' patients. This provides a more realistic and enhanced training experience. The different virtual faces may be displayed sequentially or concurrently.
The software may be arranged to enable the user to perform a simulation of a clinical test for assessing an ocular condition. Thus, the system may provide apparatus arranged to recreate a clinical test so that the user is able to perform the actions which would be undertaken by a practitioner during assessment or treatment of an eye condition.
In the clinical assessment of certain ocular conditions, the physician would sit in front of the patient and move his finger in free space in front of the patient's eyes. The physician observes the response produced in the patient's eye(s) and uses this observation to assess the eye condition. The present invention may recreate or simulate such a test by manipulating the virtual face on the screen so that it responds to the movement of the user's finger (or other body part, or a pointing device such as a stylus). The simulated response may be designed to recreate the response of an individual having the predetermined condition.
In certain clinical tests, the pupil may be assessed by using a light source (such as a torch) to illuminate the pupil. The software may be arranged to simulate the movement of a beam of light across the virtual face and/or eyes. The software may be arranged to simulate the movement of the beam of light in correspondence with the movement of the user's finger (or other body part) on or in front of a screen on which the virtual face is displayed. Thus, the system may be arranged to interpret the position of the user's finger input as the position of a light source.
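As an illustration of this interpretation, a toy Python sketch of a pupil reacting to a finger-driven light source; the distances, diameters and thresholds are invented for the example and are not taken from the patent:

```python
def pupil_diameter_mm(beam_xy_cm, pupil_xy_cm, ambient_light_on):
    """Toy pupil model: the pupil rests larger in the dark, and
    constricts when the simulated beam falls close to it. All
    constants are illustrative only."""
    baseline = 4.0 if ambient_light_on else 7.0        # resting size, mm
    dist = ((beam_xy_cm[0] - pupil_xy_cm[0]) ** 2 +
            (beam_xy_cm[1] - pupil_xy_cm[1]) ** 2) ** 0.5
    if dist < 1.0:                                     # beam on the pupil
        return max(2.0, baseline - 2.5)                # direct constriction
    return baseline
```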
Thus, the invention can be said to provide an improved pedagogical aid for training medical practitioners in the art of assessing, diagnosing and/or treating ocular-related conditions. Its contribution may be said to lie within the (technical) field of optometry, ophthalmology and/or neuro-ophthalmology. Additionally or alternatively, it may be said to provide an improved medical/optometry simulation technique for recreating and modelling ocular conditions, symptoms and treatments. By interacting with the user, the invention provides a means for controlling and guiding the user's medical learning experience. The invention recreates and simulates known clinical tests used by medical practitioners to assess ocular conditions in real-life practice.
Any of the features described above in respect of one aspect or embodiment of the invention may also be applicable in respect of any other embodiment(s).
These and other aspects of the present invention will be apparent from, and elucidated with reference to, the embodiment described herein.
An embodiment of the present invention will now be described, by way of example only, and with reference to the accompanying drawings, in which: Figure 1 provides a screen shot of a medical training tool in accordance with an embodiment of the invention, showing a main menu which provides a user with a list of categories from which the user can select.
Figure 2 shows an accommodation target which tests convergence and divergence eye movements. The invention can be used to simulate testing of accommodative pupil response.
Figure 3 shows the virtual patient's head having been rotated by the user.
Figure 4 shows a main screen in accordance with an embodiment of the invention.
Figure 5 shows an occluder placed in front of the virtual patient's eye.
Figure 6 shows a corneal reflection being simulated using an embodiment of the invention.
Figure 7 shows the virtual patient with glasses.
Figure 8 illustrates how an embodiment of the invention could be used to assess a virtual patient's light response.
Figure 9 shows a motion sensing apparatus as known in the prior art, which may be incorporated into the present invention to detect and/or track the user's movement so that it can be used as input to direct the activities of the invention's software.
In use, the software of the present invention is installed and configured for execution on a computing device. The computing device may be a portable, mobile or handheld device such as a tablet computer, a smart phone or a laptop, or it may be a desktop PC or any other type of computing device. The device has a screen upon which the virtual patient's head or face 1 is presented.
A motion sensing arrangement 2 is in communication with the computing device so that the user's free movement can be detected and used as input for controlling the behaviour of the simulated face or eyes. The motion sensing apparatus 2 can take a variety of forms and could, for example, be an Xbox Kinect® as shown in figure 9. The motion sensing apparatus may be a webcam-style add-on peripheral which enables users to control and interact with the computing device and the software on it without the need for a controller or physical input device such as a mouse or touch screen (although such input devices may also be used in some embodiments). It enables the user to interact with the invention through a natural user interface using gestures and spoken commands. It may incorporate voice recognition technology to facilitate input via spoken commands. Thus, the invention provides a hands-free interface for the input of user commands.
The motion sensing apparatus 2 may comprise:
* 3-D depth sensors to track movement of the user's body (within free space)
* An RGB camera to identify the user, and take pictures and videos during use
* One or more microphones to enable the user to provide speech input
The invention generates a representation of a patient's face, head or portion thereof 1. In some embodiments, an external view of the patient's entire face is shown, from a front-facing perspective, as if the patient were sitting facing a practitioner in clinical practice.
The representation can be termed a 'virtual patient' because the representation simulates a real patient's head/face 1 in a computer-implemented form. This enables the invention to be used to simulate the performance of an eye examination, test or procedure. The examination could include testing visual acuity, refraction, pupil function, ocular motility, visual field (confrontation) testing, external examination, slit lamp testing, or retinal examination.
The image of the head or face 1 may be rotated so as to change the perspective on the screen, as shown in figure 3 in which the head is shown rotated slightly to one side.
Rotation of the head may be achieved by the user moving a finger across a touch screen.
In other embodiments, the user may control the orientation of the virtual head/face via the motion sensing apparatus 2, and/or by using some pointing device. The rotation of the head can be used to test the oculocephalic reflex, and to differentiate between incomitant and comitant strabismus types. The user may indicate that the head is to be rotated by selecting a 'rotate head' option 3 provided on the screen. Various controls, buttons and menus are provided on the screen to allow the user to select options and control how the training session is to be conducted. These options may include an occluder option 4 which enables the user to simulate covering an eye with a paddle, as shown in figures 5 and 7. Movement of the occluder in front of the virtual face may be controlled by corresponding movement of the user's finger or hand, which is detected by the motion sensing apparatus. Additionally or alternatively, in an embodiment which is touchscreen-enabled, the occluder may be moved and manipulated by the user via finger contact on the screen. Alternatively or additionally, it may be moved by dragging it with the use of a mouse or other pointing device. The occluder simulation feature enables simulation of tests for manifest strabismus, latent strabismus, and tests for single duction movements of an eye.
An illustrative main screen is shown in Figure 4. A single touch event (tap) on the touch screen generates saccade eye movement. Dragging a finger generates a smooth movement.
The 'near' button 5 changes the fixation distance of the eye to 'near', thus altering the program's response to user input. The 'far' button 6 changes the fixation distance of the eye to 'far', again altering the software's response accordingly. Touch events can also generate characteristic changes in the lids, dependent on the disease or condition being simulated.
The 'reset' button 8 can be used to return the face or head to the default orientation and configuration as shown in figure 1.
A list of ocular conditions 13 can be provided on the screen from which the user can choose. These could include 'normal', 'esotropic', 'exotropic', 'infantile strabismus', 'heterophoric', 'alphabet patterns', 'neurogenic', 'myogenic', 'mechanical', 'supranuclear' and 'internuclear'. By selecting one of the options, the user can alter the way in which the virtual patient will respond to the simulated eye test(s), such as the corneal reflection shown in figure 6. The software executes a set of behavioural rules associated with the selected condition. Examples of these rules are provided further below. These options can be hidden so that the user is not told which condition is being simulated, so that the user's task is to observe the response to the simulated test and recognise (diagnose) the condition.
Figure 6 shows a corneal reflection being simulated using an embodiment of the invention. If the corneal reflection is not in the centre of the eye, it can be determined that it is not the fixating eye.
Figure 7 shows the virtual patient wearing glasses. Putting glasses on the patient enables the user to test for an accommodative/refractive element of strabismus. The user can add or remove glasses using the 'glasses' icon or button 7 on the screen.
Other assessment options are shown in Figures 6 and 8. In ocular physiology, 'adaptation' is the ability of the eye to adjust to various levels of darkness and light. An ambient light switch can be used to test the response of a patient's pupils to light, or direct light can be used to test pupil response. Figure 8 shows direct light being used to test pupil response.
This option can be selected using the 'direct light' button 11.
The invention provides the ability to simulate switching ambient light levels. In clinical practice, the practitioner would alter the light setting in the room within which the test is being performed. If the room lights are switched off, the pupils should enlarge.
Conversely, they should become smaller if the light is switched on. The invention simulates this by providing an 'ambient light' switch 10 which the user can use to change the ambient light conditions and observe the pupil response. The switch 10 has two settings: 1 for 'ambient light on' and 0 for 'ambient light off'.
By selecting the torch ('flashlight') option 9 displayed on the screen of the invention, the user is able to simulate shining a beam of light into the virtual patient's eye and/or onto the face, as shown in figure 6. This allows the user to simulate the performance of a corneal reflection test, which is performed by shining a light in the person's eyes and observing where the light reflects off the corneas. In a person with normal ocular alignment the light reflex lies slightly nasal from the centre of the cornea (approximately 11 prism dioptres, or 0.5 mm, from the pupillary axis), as a result of the cornea acting as a temporally-turned convex mirror to the observer. When performing the test, the light reflexes of both eyes are compared, and will be symmetrical in an individual with normal fixation. For an abnormal result, based on where the light lands on the cornea, the practitioner can detect if there is an exotropia (abnormal eye is turned out), esotropia (abnormal eye is turned in), hypertropia (abnormal eye is higher than the normal one) or hypotropia (abnormal eye is lower than the normal one).
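The classification logic stated above can be sketched as follows; the 0.5 mm threshold follows the figure in the preceding paragraph, while the sign convention and function name are assumptions of this sketch:

```python
def classify_hirschberg(dx_mm, dy_mm, threshold_mm=0.5):
    """Classify a tropia from the displacement of the corneal light
    reflex relative to the pupil centre, per the clinical rule above.
    Convention (an assumption): dx > 0 means the reflex is displaced
    temporally, dy > 0 means it is displaced upwards. The description's
    figures (11 prism dioptres for 0.5 mm) imply roughly 22 prism
    dioptres of deviation per millimetre of displacement."""
    if abs(dx_mm) <= threshold_mm and abs(dy_mm) <= threshold_mm:
        return "normal fixation"
    if dx_mm > threshold_mm:
        return "esotropia"    # temporal reflex: the eye is turned in
    if dx_mm < -threshold_mm:
        return "exotropia"    # nasal reflex: the eye is turned out
    if dy_mm < -threshold_mm:
        return "hypertropia"  # reflex below centre: the eye sits high
    return "hypotropia"       # reflex above centre: the eye sits low
```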
Movement of the beam of light can be controlled in all directions by the user's gestures.
The motion sensing apparatus detects the position, angle and relative displacement of the user's hand and coordinates the movement of the simulated beam accordingly. An algorithm is used to calculate how the beam of light would be affected in response to the user's movement. Another algorithm then adapts the face/eyes on the display so that the virtual patient responds to the moving light. The response is tailored so that the virtual patient's response is indicative of a given condition.
This feature distinguishes the invention over the prior art because it enables a more realistic simulation to be produced. Prior art techniques 'hard-wire' the distance of the light source or lens etc. into their simulations.
Turning to Figure 2, the patient is shown with an object 14 placed in front of the face to check the patient's accommodation. This test can be selected for performance via the accommodation option 12 on the screen. The accommodation reflex is a vision reflex that enables people to quickly transfer their focus between near and distant objects. It allows the eye to respond to an out-of-focus image very quickly. Patients with a slowed accommodation reflex may have a variety of vision disorders. Three separate phenomena make up the accommodation reflex: convergence, changes to lens shape, and pupil constriction. Together, they allow the eye to adapt when it shifts between near and distant targets. When an object 14 is moved towards a patient with normal accommodation, the eyes will turn in (converge) and the pupils will become smaller, as shown in Figure 2.
In another embodiment, the invention provides a simulation tool which enables a user to simulate the performance of a clinical procedure. The procedure may be a form of treatment or assessment of an ocular condition, or a corrective procedure. This could be, for example, laser surgery, or a retinoscopy. In such an embodiment, the user is able to manipulate a virtual (simulated) surgical tool such as an aiming beam or a lens. The user's free hand motion is detected and is used to manipulate the virtual tool. The eye or portion of the face shown on the screen is adapted in accordance with the user's manipulation of the virtual tool. As 3-dimensional motion is detected by the invention, the user is able to manipulate the tool with a greater range of motions. For example, movement of the user's hand towards or away from the sensor can be used to simulate the application or withdrawal of pressure on the eye. By contrast, prior art arrangements which only provide for 2-directional input (e.g. by movement of a mouse or pointing device) cannot provide this enhanced experience.
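As an illustration of this extra degree of freedom, a sketch mapping the hand's depth from the sensor onto simulated pressure; the contact distance and scaling are invented for the example:

```python
def simulated_pressure(hand_z_m, contact_z_m=0.40, full_depth_m=0.05):
    """Map the hand's depth from the sensor (metres) onto a normalised
    pressure applied by the virtual tool. Moving the hand closer than
    the assumed contact distance applies increasing pressure; moving it
    away withdraws the tool. The constants are illustrative only; a
    2-D pointing device has no equivalent of this third axis."""
    depth_past_contact = contact_z_m - hand_z_m
    if depth_past_contact <= 0:
        return 0.0                          # tool not touching the eye
    return min(1.0, depth_past_contact / full_depth_m)
```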
The software of the invention provides methods and functionality which incorporate the rules that determine how the simulation will respond to the user's input. A plurality of rule sets are provided, each rule set relating to the behaviour associated with a particular ocular condition. The simulation's response is determined by the ocular condition which has been selected for simulation. Some illustrative functionality and rules which may be incorporated in an embodiment of the invention are shown below, in order to allow the user to simulate different examinations or tests.
Methods - Basic Movement
- (co-ordinate eyes R and L) saccade: (co-ordinate) touch event { touching the screen causes both eyes to move to the corresponding location at 400°/sec }
- (co-ordinate eyes R and L) smoothPursuit: (co-ordinate) drag event { if (touch event maintained) { move right and left eyes to the corresponding co-ordinates } }
- (co-ordinate eyes R and L) headTilt: (tilt°) accelerometer { accelerometer tilt is proportional to co-ordinates R and L }
- (co-ordinate eyes R and L) stopMovement: (°superior, °IO, °MR, °inferior, °IR, °LR, °SR) limitations of gaze { eyes cannot rotate past a limitation of gaze }

Audio Output - turned OFF for eso/exo objects
- (audio output) SpeakingStart: (°co-ordinate fixing eye) fixing eye (°co-ordinate nonFixing eye) nonFixing eye { if (difference between fixing-eye and non-fixing-eye co-ordinates is greater than 2° vertically or 4° horizontally) { output "I have double vision" } else { output "I have no double vision" } }
- (audio output) SpeakingMoveWorse: (°co-ordinate fixing eye) fixing eye (°co-ordinate nonFixing eye) nonFixing eye (tilt°) accelerometer { if (difference between fixing-eye and non-fixing-eye co-ordinates is greater than 2° vertically or 4° horizontally, and on tilting the app this increases by another 4°) { output "my double vision is worse" } }
- (audio output) SpeakingMoveBetter: (°co-ordinate fixing eye) fixing eye (°co-ordinate nonFixing eye) nonFixing eye (tilt°) accelerometer { if (difference between fixing-eye and non-fixing-eye co-ordinates is greater than 2° vertically or 4° horizontally, and on tilting the app this resolves to 0) { output "my double vision has gone" } }

Methods relating to Esophoria/Exophoria:
Cover test
- (co-ordinate uncovered eye) coverTest: (*Occluder) occluder { if (occluder == fixing eye) { move uncovered eye to the corresponding fixing-eye location } else { no movement } }
* So if the fixing eye is covered, the uncovered eye will take up the corresponding location. If looking straight ahead this is 0. If looking 30° outwards, this is 30 degrees inwards.
Alternate cover test
- (movement eye) altCoverTest: (*Occluder) occluder (°vertical, °horizontal) start position under occluder { if (occluder held > 1.5 seconds and passed over alternating eyes > 2 times) { eye moves from start position under occluder to co-ordinates of fixation } else { result = coverTest } }
* If looking straight ahead, the co-ordinates of fixation will be zero. If the tilt of the tablet indicates looking up 30°, this will be fixation.

Prism cover test (Prisms are triangles with a base and a point. If there is no movement on covering the eye with the correct prism in place, the eye will not move.)
- (fixation point non-fixing eye) prismCoverTest: (dioptres) prism bar { if (prism covers non-fixing or fixing eye) { movement of the non-fixing eye to fixation = (normal movement - dioptres/2) } }

Non Eso/Exo Class Methods
The eyes move at the same speed but, due to overactions and limitations, they stop at individual limitations of gaze.
move = limitations of gaze with no occluder over either eye.
cover = limitations of gaze for that eye when the occluder is over the other eye.
alt = during the alternating cover test, the position the eye will take under the cover.
(If the tablet/computing device is tilted 30 degrees back, so looking up 30 degrees, and there is a 200% limitation on alt and a 100% limitation on move, once the alternating cover test is started, on removing the occluder the eye will move from 60 degrees to 30 degrees.)

Cover test
- (co-ordinate uncovered eye) coverTest: (*Occluder) occluder { if (occluder == fixing eye) { move uncovered eye to the corresponding fixing-eye location } else { no movement } }
* So if the fixing eye is covered, the uncovered eye will take up the corresponding location. If looking straight ahead this is 0. If looking 30° outwards, this is 30 degrees inwards.
Alternate cover test
- (movement eye) altCoverTest: (*Occluder) occluder (°vertical, °horizontal) start position under occluder { if (occluder held > 1.5 seconds and passed over alternating eyes > 2 times) { eye moves from start position under occluder to co-ordinates of fixation } else { result = coverTest }; if (tablet tilted to > normal limitations of gaze) { the eye will move from the alt limitation of gaze to the cover limitation of gaze } }
* If looking straight ahead, the co-ordinates of fixation will be zero. If the tilt of the tablet/computing device indicates looking up 30°, this will be fixation.
Extra Methods
Intorsion is the eye rotating inwards and extorsion is outwards. Therefore the right eye will rotate clockwise when intorting and anticlockwise when extorting.
- (movement eye) intort: fingerDrag { if (eye moved up and in by 5 degrees) { rotate the eye's intorsion by 5 degrees } }
- (movement eye) extort: fingerDrag { if (eye moved down by 5 degrees) { rotate the eye's extorsion by 5 degrees } }
- (movement eye) tiltObliqueBadSameSide: (tablet tilt same side) { same side down 30 degrees: increase start point, move 10 degrees up and extort the eye by 10 degrees; for the eye on the same side as the tilt, add 10 degrees up to the under-cover start and 10° of extorsion }
When the patient's head is tilted to the right, the eye's starting point gets higher = tiltObliqueBadSameSide. When the patient's head is tilted to the left, the right eye's starting point is lower = tiltObliqueBadOtherWay.
- (movement eye) tiltObliqueBadOtherWay: (tablet tilt other side) { 10 degrees: decrease start point under alt cover by 5 degrees; if (start point > 5° vertical) { decrease 5 degrees vertically } }
- (movement eye) upshoot: (drag event) { if the drag event moves the eye inwards horizontally, it instead moves to the limits of gaze for the inferior oblique, up and in }
- (nystagmus object) convergenceRetraction: (touch event) { eye unable to rotate higher than 10 degrees on attempting up-gaze. Amp: 15, Hz: 16 }
- (nystagmus object) seeSaw: { in all positions of gaze, the eyes intort 10° and rise 10°, then the eyes extort 10° and drop 10°. Hz: 5 }

Movement Speed Abnormalities
- (co-ordinate eyes R and L) saccadeSlow: (co-ordinate) touch event { touching the screen causes both eyes to move to the corresponding location at 40°/sec }
- (co-ordinate eyes R and L) saccadeOvershoot: (co-ordinate) touch event { touching the screen causes both eyes to move to 5° past the corresponding location at 400°/sec, before moving back to the correct place at the same speed }
- (co-ordinate eyes R and L) saccadeUnderShoot: (co-ordinate) touch event { touching the screen causes both eyes to move to 5° before the corresponding location at 400°/sec, before moving to the correct place at the same speed }
- (co-ordinate eyes R and L) saccadeSlowUnderShoot: (co-ordinate) touch event { touching the screen causes both eyes to move half way to the corresponding location at a speed of 40°/sec; the eye stops for 0.5 sec then starts again to the correct position }
- (co-ordinate eyes R and L) saccadeGlissadeOver: (co-ordinate) touch event { touching the screen causes both eyes to overshoot the corresponding location by 5° at a speed of 400°/sec; the eye stops then moves to the correct location at a speed of 40°/sec }
- (co-ordinate eyes R and L) cogwheel: (co-ordinate) drag event { on a drag event the eyes move at a speed of 400°/sec for 10°, then stop and wait for the touch event to catch up before starting again }
Extra Methods - Eso/Exo
- (fixing eye) alternating: (co-ordinates nonFixing) nonFixing (co-ordinates fixing) fixing { if (occluder == fixing eye) { on removing the occluder, the previous fixing eye is now non-fixing } }
- (fixing eye) crossFixation: (co-ordinates nonFixing) nonFixing (co-ordinates fixing) fixing { if (non-fixing eye moves > 10° past the midline outwards) { non-fixing == fixing eye } }
- (change smooth pursuit method) inferiorObliqueOveraction: (dragEvent) { +10° to the limitation of gaze }
- (movement covered eye) DVD: (*occluder) occluder { on removing the occluder after coverTest, the eye will move back to the start point at a speed of 40°/sec from a position 10° up and out }
- (movement covered eye) DHD: (*occluder) occluder { on removing the occluder after coverTest, the eye will move back to the start point at a speed of 40°/sec from a position 10° out }
- (movement eye) altCoverTestDecompensating: (*Occluder) occluder (°vertical, °horizontal) start position under occluder { if (occluder held > 1.5 seconds and passed over alternating eyes > 4 times) { eye moves from start position ×2 under occluder to 0 } else { result = coverTest } }
- (method) intermittentRight: (coverTest) coverTest (altCoverTest) { if (altCover undertaken ×2) { convert altCoverTest results to right cover test } }
- (method) intermittentSpontaneousRight: (coverTest) coverTest (altCoverTest) { on opening the GUI, convert from the altCoverTest result to right coverTest start points and left 0; then after 10 seconds convert the starting point back to 0 }
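For readability, the cover test rule above might be rendered in Python roughly as follows (the patent's listings are pseudocode and fix no particular language; the names here are illustrative):

```python
def cover_test(covered_eye, fixing_eye, uncovered_eye_pos, fixation_pos):
    """Cover test rule from the listing above: if the occluder covers
    the fixing eye, the uncovered eye moves to take up fixation (e.g.
    an eye resting 30 degrees outwards moves 30 degrees inwards);
    covering the non-fixing eye produces no movement."""
    if covered_eye == fixing_eye:
        return fixation_pos        # uncovered eye takes up fixation
    return uncovered_eye_pos       # no movement
```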
Therefore, the invention provides the user with a more effective and realistic training tool than prior art techniques.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be capable of designing many alternative embodiments without departing from the scope of the invention as defined by the appended claims. In the claims, any reference signs placed in parentheses shall not be construed as limiting the claims. The words "comprising" and "comprises", and the like, do not exclude the presence of elements or steps other than those listed in any claim or the specification as a whole. In the present specification, "comprises" means "includes or consists of" and "comprising" means "including or consisting of". The singular reference of an element does not exclude the plural reference of such elements and vice-versa. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a device claim enumerating several means, several of these means may be embodied by one and the same item of hardware.
The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims (22)

  1. A medical training system comprising: software arranged to reconfigure a virtual face or portion thereof in response to a user's movement in free space and in accordance with one or more symptoms relating to a predetermined ocular condition.
  2. A system according to claim 1 and further comprising: motion sensing means arranged to detect the user's movement made in free space.
  3. A system according to claim 1 or 2 and further comprising: a display device for displaying the virtual face or portion thereof.
  4. A system according to any preceding claim wherein the user is able to input a diagnosis of the predetermined ocular condition.
  5. A system according to claim 4 wherein the software is arranged to determine whether the user has correctly diagnosed the ocular condition by comparing the user's inputted diagnosis with a stored record of the predetermined ocular condition.
  6. A system according to any preceding claim and further comprising means for storing one or more rules in association with the predetermined ocular condition.
  7. A system according to claim 6 wherein the one or more rules specify how the virtual face or portion thereof is to be reconfigured in accordance with the ocular condition.
  8. A system according to any preceding claim wherein the software is arranged to display a variety of virtual faces or portions to the user.
  9. A system according to any preceding claim wherein the system is arranged to enable the user to perform a simulation of a clinical test for assessing and/or diagnosing the ocular condition.
  10. A system according to any preceding claim wherein the system is arranged to show or hide the name or other content identifying the ocular condition upon instruction from the user.
  11. A system according to any preceding claim wherein the software is arranged to reconfigure the virtual face or portion thereof in response to a simulated moving beam of light or item of medical equipment.
  12. A system according to claim 11 wherein the movement of the simulated beam of light or medical equipment is influenced by the user's movement in free space.
  13. A system according to any preceding claim wherein the system is arranged to enable the user to simulate performance of a corrective procedure.
  14. A system according to claim 13 wherein the corrective procedure is laser surgery.
  15. A system according to any preceding claim wherein the system is arranged to provide a medical training tool.
  16. A medical training method comprising the steps: reconfiguring a virtual face or portion thereof in response to a user's movement in free space, and in accordance with behaviour symptomatic of a predetermined ocular condition.
  17. A medical training system arranged to enable a user to simulate the performance of a clinical procedure performed in relation to an ocular condition, the system comprising: software arranged to simulate the movement of an item of medical equipment controlled by a user, the movement corresponding to the user's movement in free space.
  18. A medical training system according to claim 17 wherein the system further comprises a motion sensing means arranged to detect the user's movement made in free space.
  19. A medical training system according to claims 17 or 18 wherein the procedure is laser surgery.
  20. A medical training system according to claims 17 or 18 wherein the procedure is a retinoscopy.
  21. A medical training system arranged to simulate facial and/or ocular behaviour associated with a variety of ocular conditions comprising: a plurality of rule sets, each rule set specifying behaviour associated with a respective ocular condition; software arranged to reconfigure a computer-generated representation of a face or portion thereof in accordance with at least one of the rule sets and in response to a user's input.
  22. A system according to claim 21 wherein the user's input is received via a motion sensing arrangement.
GB1411134.8A 2014-06-23 2014-06-23 Ocular simulation tool Withdrawn GB2527514A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1411134.8A GB2527514A (en) 2014-06-23 2014-06-23 Ocular simulation tool
PCT/GB2015/051811 WO2015198023A1 (en) 2014-06-23 2015-06-23 Ocular simulation tool

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1411134.8A GB2527514A (en) 2014-06-23 2014-06-23 Ocular simulation tool

Publications (2)

Publication Number Publication Date
GB201411134D0 GB201411134D0 (en) 2014-08-06
GB2527514A true GB2527514A (en) 2015-12-30

Family

ID=51409984

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1411134.8A Withdrawn GB2527514A (en) 2014-06-23 2014-06-23 Ocular simulation tool

Country Status (2)

Country Link
GB (1) GB2527514A (en)
WO (1) WO2015198023A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106448399A (en) * 2016-08-31 2017-02-22 刘锦宏 Method for simulating minimally invasive surgeries based on augmented reality
CN106530880A (en) * 2016-08-31 2017-03-22 徐丽芳 Experiment simulation method based on virtual reality technology
WO2018118858A1 (en) 2016-12-19 2018-06-28 National Board Of Medical Examiners Medical training and performance assessment instruments, methods, and systems
CN109559573B (en) * 2019-01-07 2021-05-28 王登芹 Physical diagnosis teaching device for diagnostics teaching
KR102406472B1 (en) * 2020-07-31 2022-06-07 전남대학교산학협력단 Simulation system for educating cross-eye based on virtual reality

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110091856A1 (en) * 2008-06-11 2011-04-21 Vrmagic Gmbh Opthalmoscope simulator

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070207448A1 (en) * 2006-03-03 2007-09-06 The National Retina Institute Method and system for using simulation techniques in ophthalmic surgery training

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110091856A1 (en) * 2008-06-11 2011-04-21 Vrmagic Gmbh Opthalmoscope simulator

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
VRmagic, "eyesi by VRmagic Indirect Ophthalmoscope Simulator" [online], published 2013. *

Also Published As

Publication number Publication date
GB201411134D0 (en) 2014-08-06
WO2015198023A1 (en) 2015-12-30

Similar Documents

Publication Publication Date Title
US10083631B2 (en) System, method and computer program for training for ophthalmic examinations
US20240099575A1 (en) Systems and methods for vision assessment
KR102669685B1 (en) Light field processor system
Ong et al. Applications of extended reality in ophthalmology: systematic review
Khademi et al. Comparing “pick and place” task in spatial augmented reality versus non-immersive virtual reality for rehabilitation setting
US20220013228A1 (en) Split vision visual test
KR20160090065A (en) Rehabilitation system based on gaze tracking
CN104244859A (en) General micro-operation simulator
RU2634682C1 (en) Portable device for visual functions examination
WO2015198023A1 (en) Ocular simulation tool
US20210045628A1 (en) Methods, systems, and computer readable media for testing visual function using virtual mobility tests
Soto et al. AR stereoscopic 3D human eye examination app
Nguyen et al. An experimental training support framework for eye fundus examination skill development
CN114402378B (en) Surgical simulator system and method
US11337605B2 (en) Simulator for the evaluation of a concussion from signs displayed during a visual cranial nerve assessment
US20240180416A1 (en) System and method for providing visual field tests
JP2024531380A (en) EYE-MIMIC CAMERA-ASSISTED ROBOT FOR LIVE, VIRTUAL OR REMOTE EYE SURGERY TRAINING APPARATUS AND METHOD - Patent application
Uribe-Quevedo et al. Physical and physiological data for customizing immersive VR training
US20230293004A1 (en) Mixed reality methods and systems for efficient measurement of eye function
US20230404388A1 (en) Method and apparatus for measuring relative afferent pupillary defects
Lopez Off-the-shelf gaze interaction
Schott et al. Lean-Interaction: passive image manipulation in concurrent multitasking
Hvass et al. A preliminary exploration in the correlation of cybersickness and gaze direction in VR
Pezzei Visual and Oculomotoric Assessment with an Eye-Tracking Head-Mounted Display
Keyvanara Perceptual Manipulations for Hiding Image Transformations in Virtual Reality

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)