
CN117441212A - Visualizing a mixture directly with rendering elements to display the mixture elements and actions occurring on and off screen - Google Patents

Visualizing a mixture directly with rendering elements to display the mixture elements and actions occurring on and off screen Download PDF

Info

Publication number
CN117441212A
Authority
CN
China
Prior art keywords
surgical
tracking
display
hub
field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280040630.4A
Other languages
Chinese (zh)
Inventor
F. E. Shelton IV
S. R. Adams
M. D. Cowperthwait
C. G. Kimball
M. L. Z. Rivard
L. N. Rossoni
R. Kojcev
F. J. Bork
K. M. Fiebig
J. L. Harris
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cilag GmbH International
Original Assignee
Cilag GmbH International
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/688,653 external-priority patent/US20220331013A1/en
Application filed by Cilag GmbH International filed Critical Cilag GmbH International
Priority claimed from PCT/IB2022/053370 external-priority patent/WO2022219498A1/en
Publication of CN117441212A publication Critical patent/CN117441212A/en
Pending legal-status Critical Current

Landscapes

  • Surgical Instruments (AREA)

Abstract

Disclosed herein are devices, systems, and methods for mixed reality visualization. In one aspect, a method for mixed reality visualization includes: capturing, by a first camera of a first visualization system, an image of an object in a surgical field of view, wherein a first portion of the object is located outside the field of view of the first camera; tracking, by a tracking system, a position of a second portion of the object; determining, by a surgical hub, a property of the object based on the tracked position of the second portion of the object, wherein the property of the object is related to the first portion of the object that is outside of the field of view of the camera; and displaying, by an augmented reality display device, the captured image of the object in the surgical field and a graphic based on the attribute of the object.

Description

Visualizing a mixture directly with rendering elements to display the mixture elements and actions occurring on and off screen
Cross Reference to Related Applications
The present application claims the benefit of U.S. Provisional Patent Application No. 63/174,674, entitled "HEADS UP DISPLAY," filed April 14, 2021, and U.S. Provisional Patent Application No. 63/284,326, entitled "INTRAOPERATIVE DISPLAY FOR SURGICAL SYSTEMS," filed November 30, 2021, each of which is incorporated herein by reference in its entirety.
Background
The present disclosure relates to devices, systems, and methods for providing an augmented reality interactive experience during a surgical procedure. During a surgical procedure, it is desirable to provide an augmented reality interactive experience of a real-world environment in which objects residing in the real world are enhanced by superimposing computer-generated sensory information (sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory). In the context of the present disclosure, the surgical field and the images of surgical instruments and other objects present in the surgical field are enhanced by superimposing computer-generated visual, auditory, tactile, somatosensory, olfactory, or other sensory information over the surgical field and the real world images of instruments or other objects present in the surgical field. The image may be streamed in real-time, or may be a still image.
Real-world surgical instruments include a variety of surgical devices, including energy devices, staplers, or combination energy and stapler devices. Energy-based medical devices include, but are not limited to, Radio Frequency (RF)-based monopolar and bipolar electrosurgical instruments, ultrasonic surgical instruments, combination RF electrosurgical and ultrasonic instruments, combination RF electrosurgical and mechanical staplers, and the like. Surgical stapler devices are surgical instruments used to cut and staple tissue in a variety of surgical procedures, including bariatric (weight loss), breast, colorectal, gynecological, urological, and general surgery.
Disclosure of Invention
In various aspects, the present disclosure provides a method for mixed reality visualization of a surgical system. In some aspects, the method comprises: capturing, by a first camera of a first visualization system, an image of an object in a surgical field of view, wherein a first portion of the object is located outside the field of view of the first camera; tracking, by a tracking system, a position of a second portion of the object; determining, by a surgical hub, a property of the object based on the tracked position of the second portion of the object, wherein the property of the object is related to the first portion of the object that is outside of the field of view of the camera; and displaying, by an augmented reality display device, the captured image of the object in the surgical field and a graphic based on the attribute of the object. In one aspect, the object comprises a surgical instrument, patient tissue, or a user, or a combination thereof.
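By way of illustration only (the class names, coordinate frame, and instrument geometry below are invented for this sketch and are not part of the disclosure), the flow of capturing an image, tracking a visible portion of the object, deriving an attribute of the off-screen portion at the hub, and rendering a graphic for it can be pictured in a few lines of Python:

```python
import math
from dataclasses import dataclass

@dataclass
class TrackedPose:
    """Position of the tracked (second) portion of the object, e.g. an instrument shaft."""
    x: float
    y: float
    heading_deg: float  # orientation reported by the tracking system

@dataclass
class FieldOfView:
    """Rectangular bounds of the first camera's surgical field of view."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def derive_offscreen_attribute(pose: TrackedPose, tip_offset: float) -> tuple:
    """Hub-side step: infer where the untracked first portion (e.g. the instrument tip)
    lies, from the tracked second portion plus the known instrument geometry."""
    tip_x = pose.x + tip_offset * math.cos(math.radians(pose.heading_deg))
    tip_y = pose.y + tip_offset * math.sin(math.radians(pose.heading_deg))
    return tip_x, tip_y

def build_overlay(fov: FieldOfView, pose: TrackedPose, tip_offset: float) -> dict:
    """Return a graphic descriptor for the AR display when the tip lies outside the FOV."""
    tip_x, tip_y = derive_offscreen_attribute(pose, tip_offset)
    if fov.contains(tip_x, tip_y):
        return {"type": "none"}  # the tip is visible in the captured image; no graphic needed
    return {"type": "offscreen_indicator", "estimated_tip": (tip_x, tip_y)}

if __name__ == "__main__":
    fov = FieldOfView(0, 100, 0, 100)
    pose = TrackedPose(x=90, y=50, heading_deg=0)    # shaft tracked near the right edge
    print(build_overlay(fov, pose, tip_offset=25))   # tip projected outside the FOV
```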
In various aspects, the present disclosure provides a surgical system for mixed reality visualization. In some aspects, the system includes a first visualization system including a first camera configured to capture an image of an object in a surgical field of view, wherein a first portion of the object is located outside of the field of view of the camera; a first tracking system configured to be able to track a position of a second portion of the object; a surgical hub configured to determine a property of the object based on the tracked position of the second portion of the object, wherein the property of the object is related to the first portion of the object that is outside of the field of view of the camera; and an augmented reality display device configured to display a captured image of the object in the surgical field and a graphic based on the attribute of the object. In one aspect, the object comprises a surgical instrument, patient tissue, a user, or a combination thereof.
Drawings
The various aspects described herein, both as to organization and method of operation, together with further objects and advantages thereof, may best be understood by reference to the following description taken in connection with the accompanying drawings.
Fig. 1 is a block diagram of a computer-implemented interactive surgical system according to one aspect of the present disclosure.
Fig. 2 is a surgical system for performing a surgical procedure in an operating room according to one aspect of the present disclosure.
Fig. 3 is a surgical hub paired with a visualization system, robotic system, and intelligent instrument, according to one aspect of the present disclosure.
Fig. 4 illustrates a surgical data network including a modular communication hub configured to enable connection of modular devices located in one or more operating rooms of a medical facility or any room specially equipped for surgical procedures in the medical facility to a cloud, according to one aspect of the present disclosure.
Fig. 5 illustrates a computer-implemented interactive surgical system in accordance with an aspect of the present disclosure.
Fig. 6 illustrates a surgical hub including a plurality of modules coupled to a modular control tower according to one aspect of the present disclosure.
Fig. 7 illustrates an Augmented Reality (AR) system including an intermediate signal combiner positioned in a communication path between an imaging module and a surgical hub display, according to one aspect of the present disclosure.
Fig. 8 illustrates an Augmented Reality (AR) system including an intermediate signal combiner positioned in a communication path between an imaging module and a surgical hub display, according to one aspect of the present disclosure.
Fig. 9 illustrates an Augmented Reality (AR) device worn by a surgeon to transmit data to a surgical hub, according to one aspect of the present disclosure.
Fig. 10 illustrates a system for augmenting surgical instrument information using an augmented reality display according to one aspect of the present disclosure.
Fig. 11 illustrates a timeline of a situational awareness surgical procedure in accordance with an aspect of the present disclosure.
Fig. 12 illustrates a surgical system including a tracking system configured to track objects within an operating room in accordance with an aspect of the present disclosure.
Fig. 13 shows a schematic side view of an exemplary implementation of the tracking system of fig. 12 in an operating room, in accordance with an aspect of the present disclosure.
Fig. 14 illustrates a schematic plan view of an exemplary operating room map generated by an operating room mapping module in accordance with an aspect of the disclosure.
Fig. 15 is a table of exemplary tracked object interactions determined by a surgical hub based on tracking system generated data in accordance with an aspect of the present disclosure.
Figs. 16A and 16B illustrate exemplary intraoperative displays including images of surgical instruments in a surgical field of view and graphics representing a portion of a surgical instrument outside the field of view in accordance with one aspect of the present disclosure.
Figs. 17A and 17B illustrate exemplary intraoperative displays including images of stomach tissue as a surgeon forms a cut line in the stomach tissue using an endocutter in accordance with one aspect of the present disclosure.
Fig. 18 illustrates a method for mixed reality visualization of a surgical system in accordance with an aspect of the present disclosure.
Corresponding reference characters indicate corresponding parts throughout the several views. The exemplifications set out herein illustrate various disclosed embodiments, in one form, and such exemplifications are not to be construed as limiting the scope of the invention in any manner.
Detailed Description
The applicant of the present application owns the following U.S. patent applications filed concurrently herewith, the disclosure of each of these patent applications being incorporated herein by reference in its entirety:
U.S. patent application entitled "METHOD FOR INTRAOPERATIVE DISPLAY FORSURGICAL SYSTEMS", attorney docket END9352USNP 1/210120-1M;
U.S. patent application entitled "UTILIZATION OF SURGICAL DATA VALUESAND SITUATIONAL AWARENESS TO CONTROL THEOVERLAY IN SURGICAL FIELDVIEW",
agent case END9352USNP 2/210120-2;
U.S. patent application entitled "SELECTIVE AND ADJUSTABLE MIXED REALITYOVERLAY IN SURGICAL FIELDVIEW", attorney docket END9352USNP 3/210120-3;
U.S. patent application entitled "RISK BASED PRIORITIZATION OF DISPLAYASPECTS IN SURGICAL FIELDVIEW", attorney docket END9352USNP 4/210120-4;
U.S. patent application entitled "SYSTEMS AND METHODS FOR CONTROLLINGSURGICAL DATA OVERLAY", attorney docket END9352USNP 5/210120-5;
U.S. patent application entitled "SYSTEMS AND METHODS FOR CHANGINGDISPLAY OVERLAY OF SURGICAL FIELDVIEW BASEDON TRIGGERING EVENTS", attorney docket END9352USNP 6/210120-6;
U.S. patent application entitled "CUSTOMIZATION OF OVERLAID DATA ANDCONFIGURATION", attorney docket END9352USNP 7/210120-7;
U.S. patent application entitled "INDICATION OF THE COUPLE PAIR OF REMOTECONTROLS WITH REMOTE DEVICES FUNCTIONS", attorney docket END9352USNP 8/210120-8;
U.S. patent application Ser. No. COOPERATIVE OVERLAYS OF INTERACTINGINSTRUMENTS WHICH RESULT IN BOTH OVERLAYSBEING EFFECTED, attorney docket END9352USNP 9/210120-9;
U.S. patent application entitled "ANTICIPATION OF INTERACTIVE UTILIZATIONOF COMMON DATA OVERLAYS BY DIFFERENT USERS", attorney docket END9352USNP 10/210120-10;
U.S. patent application Ser. No. SYSTEM AND METHOD FOR TRACKING A PORTION OF THE USER AS A PROXY FOR NON-MONITORED INSTRUMENT, attorney docket END9352USNP 12/210120-12;
U.S. patent application entitled "UTILIZING CONTEXTUAL PARAMETERS OF ONE OR MORE SURGICAL DEVICES TO PREDICT A FREQUENCY INTERVAL FOR DISPLAYING SURGICAL INFORMATION", attorney docket END9352USNP 13/210120-13;
U.S. patent application Ser. No. COOPERATION AMONG MULTIPLE DISPLAY SYSTEMS TO provider A HEALTHCARE USER CUSTOMIZED INFORMATION, attorney docket No. END9352USNP 14/210120-14;
U.S. patent application entitled "INTRAOPERATIVE DISPLAY FOR SURGICAL SYSTEMS", attorney docket END9352USNP 15/210120-15;
U.S. patent application entitled "ADAPTATION AND ADJUSTABILITY OR OVERLAID INSTRUMENT INFORMATION FOR SURGICAL SYSTEMS", attorney docket END9352USNP 16/210120-16; and
U.S. patent application Ser. No. MIXED REALITY FEEDBACK SYSTEMS THAT COOPERATE TO INCREASE EFFICIENT PERCEPTION OF COMPLEX DATA FEEDS, attorney docket END9352USNP 17/210120-17.
The applicant of the present application owns the following U.S. patent applications, the disclosure of each of which is incorporated herein by reference in its entirety:
U.S. patent application Ser. No. 16/209,423, entitled "METHOD OF COMPRESSING TISSUE WITHIN A STAPLING DEVICE AND SIMULTANEOUSLY DISPLAYING THE LOCATION OF THE TISSUE WITHIN THE JAWS", now U.S. patent application publication No. US-2019-0200981-A1;
U.S. patent application Ser. No. 16/209,453, entitled "METHOD FOR CONTROLLING SMART ENERGY DEVICES," now U.S. patent application publication No. US-2019-0201046-A1.
Before explaining aspects of the surgical device and generator in detail, it should be noted that the illustrative examples are not limited in their application or use to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples may be implemented alone or in combination with other aspects, variations, and modifications, and may be practiced or carried out in various ways. Furthermore, unless otherwise indicated, the terms and expressions employed herein have been chosen for the purpose of describing the illustrative examples for the convenience of the reader and are not for the purpose of limitation. Moreover, it is to be understood that one or more of the aspects, expressions of aspects, and/or examples described below may be combined with any one or more of the other aspects, expressions of aspects, and/or examples described below.
Various aspects relate to screen displays for surgical systems for various energy-based and surgical stapler-based medical devices. Energy-based medical devices include, but are not limited to, Radio Frequency (RF)-based monopolar and bipolar electrosurgical instruments, ultrasonic surgical instruments, combined RF electrosurgical and ultrasonic instruments, combined RF electrosurgical and mechanical staplers, and the like. The surgical stapler devices include combination surgical staplers with electrosurgical and/or ultrasonic devices. Aspects of the ultrasonic surgical device may be configured to transect and/or coagulate tissue, for example, during a surgical procedure. Aspects of the electrosurgical device may be configured for transecting, coagulating, sealing, welding, and/or desiccating tissue, for example, during a surgical procedure. Aspects of the surgical stapler device can be configured to transect and staple tissue during a surgical procedure, and in some aspects, the surgical stapler device can be configured to deliver RF energy to tissue during a surgical procedure. The electrosurgical device is configured to deliver therapeutic and/or non-therapeutic RF energy to tissue. The elements of the surgical stapler device, electrosurgical device, and ultrasonic device may be used in combination in a single surgical instrument.
In various aspects, the present disclosure provides an on-screen display of real-time information to the OR team during a surgical procedure. In accordance with various aspects of the present disclosure, a number of new and unique screen displays are provided to display various visual information feedback to the OR team on a screen. In accordance with the present disclosure, the visual information may include one or more of various visual media, with or without sound. Generally, visual information includes still photography, motion-picture photography, video or audio recordings, graphic arts, visual aids, models, displays, visual presentation services, and the supporting processes. The visual information may be conveyed on any number of display options, such as, for example, the primary OR screen, the energy or surgical stapler device itself, a tablet computer, augmented reality glasses, and the like.
In various aspects, the present disclosure provides a potentially large list of options for communicating visual information to the OR team in real time without overwhelming the OR team with excessive visual information. For example, in various aspects, the present disclosure provides screen displays of visual information that enable the surgeon or other members of the OR team to selectively activate a screen display, such as icons surrounding a screen option, to manage the amount of visual information. The active display may be determined using one factor or a combination of factors, which may include the energy-based (e.g., electrosurgical, ultrasonic) or mechanical-based (e.g., stapler) surgical device in use, an estimate of the risk associated with a given display, the experience level of the surgeon, the surgeon's choice, and so on. In other aspects, the visual information may include a large amount of data superimposed or overlaid onto the surgical field in order to manage the visual information. In various aspects described below, superimposed images that require video analysis and tracking in order to properly overlay the data are included. In contrast to static icons, visual information data conveyed in this manner may provide additional useful visual information to the OR team in a more concise and easily understood manner.
In various aspects, the present disclosure provides techniques for selectively activating a screen display, such as icons surrounding a screen option, to manage visual information during a surgical procedure. In other aspects, the present disclosure provides techniques for determining the active display using one factor or a combination of various factors. In various aspects, techniques according to the present disclosure may include determining the active display based on the energy-based or mechanical-based surgical device in use, estimating the risk associated with a given display, utilizing the experience level of the surgeon or OR team making the selection, and so on.
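A toy sketch of such a selection policy is shown below; the scoring weights, field names, and the idea of an overlay "budget" are assumptions made purely for illustration and are not elements of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class DisplayCandidate:
    name: str            # e.g. "energy_device_overlay", "stapler_overlay"
    device_in_use: bool  # the energy- or mechanical-based device this display describes is active
    risk_estimate: float # estimated risk associated with showing (or omitting) this display, 0..1
    surgeon_selected: bool

def choose_active_displays(candidates, surgeon_experience_years: int, max_items: int = 3):
    """Rank candidate screen displays so the OR team is not flooded with visual information.

    Hypothetical policy: displays tied to devices in use or explicitly selected by the
    surgeon score highest, higher risk raises priority, and more experienced surgeons
    get a smaller automatic-overlay budget."""
    def score(c: DisplayCandidate) -> float:
        s = 2.0 if c.device_in_use else 0.0
        s += 1.5 if c.surgeon_selected else 0.0
        return s + c.risk_estimate

    budget = max_items if surgeon_experience_years < 10 else max(1, max_items - 1)
    ranked = sorted(candidates, key=score, reverse=True)
    return [c.name for c in ranked[:budget]]

if __name__ == "__main__":
    candidates = [
        DisplayCandidate("stapler_overlay", device_in_use=True, risk_estimate=0.7, surgeon_selected=False),
        DisplayCandidate("energy_device_overlay", device_in_use=False, risk_estimate=0.2, surgeon_selected=False),
        DisplayCandidate("tissue_tension_warning", device_in_use=True, risk_estimate=0.9, surgeon_selected=True),
    ]
    print(choose_active_displays(candidates, surgeon_experience_years=15))
```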
In other aspects, techniques according to the present disclosure may include superimposing or overlaying a large amount of data onto a surgical field of view to manage visual information. The various display arrangements described in this disclosure relate to superimposing various visual representations of surgical data on a live stream of a surgical field of view. As used herein, the term overlay includes semi-transparent overlays, partial overlays, and/or moving overlays. The graphic overlay may be in the form of a transparent graphic, a translucent graphic, or an opaque graphic, or a combination of transparent, translucent, and opaque elements or effects. Further, the superimposed layers may be positioned on or at least partially on or near objects in the surgical field such as, for example, end effectors and/or critical surgical structures. Some display arrangements may include changes in one or more display elements of the superimposed layers, including changes in color, size, shape, display time, display location, display frequency, highlighting, or combinations thereof, based on changes in display priority values. A graphical overlay is rendered on top of the active display monitor to quickly and efficiently communicate important information to the OR team.
In other aspects, techniques according to the present disclosure may include overlapping images that require analysis of video and tracking in order to properly overlay visual information data. In other aspects, techniques according to the present disclosure may include transmitting rich visual information instead of simple static icons, thereby providing additional visual information to the OR team in a more concise and easily understood manner. In other aspects, the visual overlay may be used in combination with an audible and/or somatosensory overlay (such as thermal, chemical and mechanical devices, and combinations thereof).
The following description relates generally to devices, systems, and methods that provide an Augmented Reality (AR) interactive experience during a surgical procedure. In this context, the surgical field and the images of surgical instruments and other objects present in the surgical field are enhanced by superimposing computer-generated visual, auditory, tactile, somatosensory, olfactory, or other sensory information onto the surgical field and the real-world images of instruments and other objects present in the surgical field. The images may be streamed in real time, or they may be still images. Augmented reality is a technology for rendering and displaying virtual or "augmented" objects, data, or visual effects superimposed on a real environment. The real environment may include a surgical field of view. A virtual object superimposed on a real environment may be represented as anchored, or held in a set position, relative to one or more aspects of the real environment. In a non-limiting example, if a real-world object leaves the real environment field of view, a virtual object anchored to that real-world object will also leave the augmented reality field of view.
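The anchoring behavior in the last example might be sketched as follows, using a simplified 2D field of view and invented names (the disclosure does not prescribe any particular implementation):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RealObject:
    """A tracked real-world object, e.g. an end effector in the surgical field."""
    x: float
    y: float

@dataclass
class AnchoredVirtualObject:
    """A virtual object rendered at a fixed offset from the real object it is anchored to."""
    anchor: RealObject
    offset_x: float
    offset_y: float

def render_position(obj: AnchoredVirtualObject, fov_width: float, fov_height: float) -> Optional[tuple]:
    """Return screen coordinates for the virtual object, or None when its anchor has
    left the camera's field of view (the virtual object leaves the AR view with it)."""
    anchor_visible = 0 <= obj.anchor.x <= fov_width and 0 <= obj.anchor.y <= fov_height
    if not anchor_visible:
        return None
    return (obj.anchor.x + obj.offset_x, obj.anchor.y + obj.offset_y)

if __name__ == "__main__":
    effector = RealObject(x=120, y=40)             # outside a 100x100 field of view
    label = AnchoredVirtualObject(effector, 5, -5)
    print(render_position(label, 100, 100))        # None: the anchor is off-screen
```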
The various display arrangements described in this disclosure relate to superimposing various visual representations of surgical data on a live stream of a surgical field of view. As used herein, the term overlay includes semi-transparent overlays, partial overlays, and/or moving overlays. Further, the superimposed layers may be positioned on or at least partially on or near objects in the surgical field such as, for example, end effectors and/or critical surgical structures. Some display arrangements may include changes in one or more display elements of the superimposed layers, including changes in color, size, shape, display time, display location, display frequency, highlighting, or combinations thereof, based on changes in display priority values.
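Purely as an illustration of priority-driven changes to display elements, a mapping from a display priority value to overlay appearance could look like the sketch below; the property names and thresholds are assumptions, not values from the disclosure.

```python
def style_for_priority(priority: float) -> dict:
    """Map a display priority value (0..1) to overlay display elements.
    Higher priority gets a more conspicuous color, a larger size, a higher display
    frequency, and highlighting; all thresholds here are invented for illustration."""
    if priority >= 0.8:
        return {"color": "red", "size": "large", "blink_hz": 2.0, "highlight": True}
    if priority >= 0.5:
        return {"color": "amber", "size": "medium", "blink_hz": 0.5, "highlight": False}
    return {"color": "white", "size": "small", "blink_hz": 0.0, "highlight": False}

# Example: a rising tissue-tension priority changes the overlay's appearance.
for p in (0.2, 0.6, 0.9):
    print(p, style_for_priority(p))
```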
As described herein, AR is an enhanced version of the real physical world, achieved through the use of digital visual elements, sounds, or other sensory stimuli delivered via technology. Virtual Reality (VR) is a computer-generated environment with scenes and objects that appear to be real, so that users feel immersed in their surroundings. The environment is perceived through a device called a virtual reality headset or helmet. Mixed Reality (MR) and AR are both considered immersive technologies, but they are not identical. MR is an extension of AR that allows real and virtual elements to interact in an environment. While AR often adds digital elements to a real-time view through the use of a camera, an MR experience combines elements of both AR and VR, in which real-world and digital objects interact.
In an AR environment, one or more computer-generated virtual objects may be displayed with one or more real (i.e., so-called "real world") elements. For example, real-time images or videos of the surrounding environment may be displayed on a computer screen display along with one or more overlaid virtual objects. Such virtual objects may provide supplemental information about the environment or generally enhance the user's perception and participation in the environment. Instead, real-time images or videos of the surrounding environment may additionally or alternatively enhance user engagement with virtual objects shown on the display.
Apparatus, systems, and methods in the context of the present disclosure enhance images received from one or more imaging devices during a surgical procedure. The imaging devices may include various endoscopes used during non-invasive and minimally invasive surgical procedures, AR devices, and/or cameras that provide images during open surgical procedures. The images may be streamed in real time, or they may be still images. The devices, systems, and methods enhance images of the real-world surgical environment by overlaying virtual objects, or representations of data and/or real objects, onto the real-world surgical environment, thereby providing an augmented reality interactive experience. The augmented reality experience may be viewed on a display and/or an AR device that allows the user to view the overlaid virtual objects in the real-world surgical environment. The display may be located in the operating room or remote from the operating room. An AR device is worn on the head of the surgeon or other operating room personnel and typically includes two stereoscopic display lenses or screens, one for each eye of the user. Natural light can pass through the two transparent or translucent display lenses so that aspects of the real environment are visible, while light is also projected so that virtual objects are visible to the user of the AR device.
Two or more displays and AR devices may be used in a coordinated manner, for example with a first display or AR device controlling one or more additional displays or AR devices in a system with defined roles. For example, when a display or AR device is activated, the user may select a role (e.g., surgeon, surgical assistant, nurse, etc., during a surgical procedure), and the display or AR device may display information relevant to that role. For example, the surgical assistant may be shown a virtual representation of the instrument that the surgeon will need for the next step of the surgical procedure. The surgeon, whose attention is on the current step, may see different display information than the surgical assistant.
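A minimal sketch of such role-dependent content, assuming a hypothetical role-to-overlay mapping (the item names are invented), is:

```python
# Hypothetical role-to-content mapping; the roles come from the description above,
# but the specific content items are invented for illustration.
ROLE_CONTENT = {
    "surgeon": ["instrument_status_overlay", "critical_structure_highlight"],
    "surgical_assistant": ["next_instrument_preview", "device_readiness"],
    "nurse": ["procedure_step_checklist", "supply_alerts"],
}

def display_items_for(role: str) -> list:
    """Return the overlay items an AR device registered under this role should show."""
    return ROLE_CONTENT.get(role, [])

print(display_items_for("surgical_assistant"))
# ['next_instrument_preview', 'device_readiness']
```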
While there are many known screen displays and warnings, the present disclosure provides many new and unique augmented reality interactive experiences during a surgical procedure. Such augmented reality interactive experiences include visual, auditory, tactile, somatosensory, olfactory, or other sensory feedback information to the surgical team inside or outside the operating room. Virtual feedback information superimposed on the real-world surgical environment may be provided to the Operating Room (OR) team, including personnel inside the OR such as, without limitation, the operating surgeon, a surgeon's assistant, a scrub person, an anesthesiologist, and a circulating nurse, among others. The virtual feedback information may be conveyed on any number of display options, such as a primary OR screen display, an AR device, an energy or surgical stapler instrument, a tablet computer, augmented reality glasses, a device display, and the like.
Fig. 1 shows a computer-implemented interactive surgical system 1 comprising one or more surgical systems 2 and a cloud-based system 4. The cloud-based system 4 may include a remote server 13 coupled to the storage 5. Each surgical system 2 includes at least one surgical hub 6 in communication with the cloud 4. For example, the surgical system 2 may include a visualization system 8, a robotic system 10, and a hand-held intelligent surgical instrument 12, each configured to communicate with each other and/or with the hub 6. In some aspects, the surgical system 2 may include M hubs 6, N visualization systems 8, O robotic systems 10, and P smart handheld surgical instruments 12, where M, N, O and P are integers greater than or equal to 1. The computer-implemented interactive surgical system 1 may be configured to provide an augmented reality interactive experience during a surgical procedure as described herein.
Fig. 2 shows an example of a surgical system 2 for performing a surgical procedure on a patient lying on an operating table 14 in a surgical operating room 16. The robotic system 10 is used as part of the surgical system 2 in a surgical procedure. The robotic system 10 includes a surgeon's console 18, a patient side cart 20 (surgical robot), and a surgical robotic hub 22. When the surgeon views the surgical site through the surgeon's console 18 or an Augmented Reality (AR) device 66 worn by the surgeon, the patient-side cart 20 may manipulate at least one removably coupled surgical tool 17 through a minimally invasive incision in the patient. An image of the surgical site of the minimally invasive surgical procedure (e.g., a still image or a live image streamed in real time) may be obtained by the medical imaging device 24. The patient side cart 20 may maneuver the imaging device 24 to orient the imaging device 24. An image of the open surgical procedure may be obtained by the medical imaging device 96. The robotic hub 22 processes the image of the surgical site for subsequent display on the surgeon's console 18 or on an AR device 66 worn by the surgeon or to other personnel in the surgical room 16.
The optical components of imaging device 24, 96 or AR device 66 may include one or more illumination sources and/or one or more lenses. One or more illumination sources may be directed to illuminate multiple portions of the surgical field. The one or more image sensors may receive light reflected or refracted from tissue and instruments in the surgical field.
In various aspects, the imaging device 24 is configured for use in minimally invasive surgical procedures. Examples of imaging devices suitable for use in the present disclosure include, but are not limited to, arthroscopes, angioscopes, bronchoscopes, choledochoscopes, colonoscopes, cystoscopes, duodenoscopes, enteroscopes, esophageal-duodenal scopes (gastroscopes), endoscopes, laryngoscopes, nasopharyngeal-renal endoscopes, sigmoidoscopes, thoracoscopes, and hysteroscopes. In various aspects, the imaging device 96 is configured for use in an open (invasive) surgical procedure.
In various aspects, the visualization system 8 includes one or more imaging sensors, one or more image processing units, one or more storage arrays, and one or more displays strategically placed relative to the sterile field. In one aspect, the visualization system 8 includes interfaces for HL7, PACS, and EMR. In one aspect, the imaging device 24 may employ multispectral monitoring to distinguish between topography and underlying structures. Multispectral images capture image data over specific wavelength ranges across the electromagnetic spectrum. Wavelengths, including light from frequencies outside the visible range such as IR and ultraviolet, are separated by filters or by devices sensitive to particular wavelengths. Spectral imaging can extract information that is not visible to the human eye. Multispectral monitoring can be used to relocate the surgical field after a surgical task is completed in order to perform tests on the treated tissue.
Fig. 2 shows the main display 19 positioned in the sterile field to be visible to an operator at the operating table 14. The visualization tower 11 is positioned outside the sterile zone and comprises a first non-sterile display 7 and a second non-sterile display 9 facing away from each other. The visualization system 8 guided by the hub 6 is configured to be able to coordinate the information flow to operators inside and outside the sterile field using the displays 7, 9, 19. For example, hub 6 may cause visualization system 8 to display AR images of the surgical site recorded by imaging devices 24, 96 on non-sterile displays 7, 9 or by AR device 66, while maintaining a real-time feed of the surgical site on primary display 19 or AR device 66. For example, the non-sterile displays 7, 9 may allow a non-sterile operator to perform diagnostic steps related to a surgical procedure.
Fig. 3 shows the surgical hub 6 in communication with the visualization system 8, the robotic system 10, and the hand-held intelligent surgical instrument 12. Hub 6 includes a hub display 35, an imaging module 38, a generator module 40, a communication module 30, a processor module 32, a memory array 34, and an operating room mapping module 33. The hub 6 further comprises a smoke evacuation module 26 and/or a suction/flushing module 28. In various aspects, the imaging module 38 includes an AR device 66 and the processor module 32 includes an integrated video processor and augmented reality modeler (e.g., as shown in fig. 10). The modular light source may be adapted for use with a variety of imaging devices. In various examples, multiple imaging devices may be placed at different locations in the surgical field to provide multiple views (e.g., non-invasive, minimally invasive, or open surgical procedures). The imaging module 38 may be configured to be switchable between imaging devices to provide an optimal view. In various aspects, the imaging module 38 may be configured to integrate images from different imaging devices and provide an augmented reality interactive experience during a surgical procedure as described herein.
Fig. 4 shows a surgical data network 51 including a modular communication hub 53 configured to enable connection of modular devices located in one or more operating rooms/rooms of a medical facility to a cloud-based system. Cloud 54 may include a remote server 63 (fig. 5) coupled to storage 55. Modular communication hub 53 includes a network hub 57 and/or a network switch 59 in communication with a network router 61. Modular communication hub 53 is coupled to local computer system 60 to process data. The modular devices 1a-1n located in the operating room may be coupled to a modular communication hub 53. The network hub 57 and/or the network switch 59 may be coupled to a network router 61 to connect the devices 1a-1n to the cloud 54 or the local computer system 60. The data associated with the devices 1a-1n may be transmitted via routers to cloud-based computers for remote data processing and manipulation. The operating room devices 1a-1n may be connected to the modular communication hub 53 by a wired channel or a wireless channel. The surgical data network 51 environment can be used to provide an augmented reality interactive experience during a surgical procedure as described herein, and in particular to provide an augmented image in a surgical field of view to one or more remote displays 58.
Fig. 5 illustrates a computer-implemented interactive surgical system 50. The computer-implemented interactive surgical system 50 is similar in many respects to the computer-implemented interactive surgical system 1. The computer-implemented interactive surgical system 50 includes one or more surgical systems 52 that are similar in many respects to the surgical system 2. Each surgical system 52 includes at least one surgical hub 56 in communication with a cloud 54, which may include a remote server 63. In one aspect, the computer-implemented interactive surgical system 50 includes a modular control tower 23 that is connected to a plurality of operating room devices, such as intelligent surgical instruments, robots, and other computerized devices located in an operating room. As shown in fig. 6, modular control tower 23 includes a modular communication hub 53 coupled to a computer system 60.
Returning to fig. 5, modular control tower 23 is coupled to imaging module 38 (which is coupled to endoscope 98), generator module 27 (which is coupled to energy device 99), smoke extractor module 76, suction/irrigation module 78, communication module 13, processor module 15, storage array 16, smart device/appliance 21 (which is optionally coupled to display 39), and sensor module 29. The operating room devices are coupled to cloud computing resources, such as servers 63, data storage 55, and display 58, via modular control tower 23. The robotic hub 72 may also be connected to the modular control tower 23 and to the server 63, the data storage 55, and the display 58. The device/instrument 21, visualization system 58, etc. may be coupled to the modular control tower 23 via a wired or wireless communication standard or protocol, as described herein. The modular control tower 23 may be coupled to a hub display 65 (e.g., monitor, screen) to display the received enhanced images, including overlaid virtual objects in the real surgical field received from the imaging module 38, the device/instrument display 39, and/or other visualization system 58. Hub display 65 may also display data received from devices connected to modular control tower 23 in combination with the image and the overlay image.
Fig. 6 shows a surgical hub 56 that includes a plurality of modules coupled to the modular control tower 23. The modular control tower 23 includes a modular communication hub 53 (e.g., a network connectivity device) and a computer system 60 to provide, for example, enhanced local processing, visualization, and imaging of surgical information. The modular communication hub 53 may be connected in a hierarchical configuration to expand the number of modules (e.g., devices) that may be connected to it and to transfer data associated with those modules to the computer system 60, cloud computing resources, or both. Each of the hubs 57/switches 59 in the modular communication hub 53 may include three downstream ports and one upstream port. The upstream hub 57/switch 59 is connected to the processor 31 to provide a communication connection with cloud computing resources and a local display 67. Communication with the cloud 54 may be through a wired or wireless communication channel.
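The tiered three-downstream/one-upstream arrangement can be pictured with a small tree structure; the class and method names below are invented for illustration and do not represent the hub's actual firmware or interfaces.

```python
class HubNode:
    """A network hub/switch with one upstream port and three downstream ports,
    so hubs can be tiered to expand the number of connectable modules."""
    DOWNSTREAM_PORTS = 3

    def __init__(self, name: str, upstream: "HubNode | None" = None):
        self.name = name
        self.upstream = upstream
        self.downstream = []

    def connect(self, device) -> None:
        if len(self.downstream) >= self.DOWNSTREAM_PORTS:
            raise RuntimeError(f"{self.name}: all downstream ports in use; add another tier")
        self.downstream.append(device)

    def forward_upstream(self, payload: dict) -> None:
        """Pass module data toward the computer system / cloud via the upstream port."""
        if self.upstream is None:
            print(f"{self.name} -> local computer system / cloud: {payload}")
        else:
            self.upstream.forward_upstream(payload)

root = HubNode("hub-0")                  # connected to the processor / local display
tier1 = HubNode("hub-1", upstream=root)
root.connect(tier1)                      # one downstream port of hub-0 feeds hub-1
tier1.forward_upstream({"module": "imaging", "frame_id": 42})
```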
The computer system 60 includes a processor 31 and a network interface 37. The processor 31 is coupled to a communication module 41, a storage device 45, a memory 46, a non-volatile memory 47 and an input/output interface 48 via a system bus. The system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any of a variety of available bus architectures.
The processor 31 includes an augmented reality modeler (e.g., as shown in fig. 10) and may be implemented as a single-core or multi-core processor, such as those known under the trade name ARM Cortex and available from Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, comprising an on-chip memory of 256KB single-cycle flash memory or other non-volatile memory (up to 40 MHz), a prefetch buffer to improve performance above 40 MHz, a 32KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare software, a 2KB electrically erasable programmable read-only memory (EEPROM), one or more Pulse Width Modulation (PWM) modules, one or more Quadrature Encoder Input (QEI) analogs, and one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product data sheet.
The system memory includes volatile memory and nonvolatile memory. A basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during start-up, is stored in nonvolatile memory. For example, the nonvolatile memory may include ROM, Programmable ROM (PROM), Electrically Programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes Random Access Memory (RAM), which acts as external cache memory. Further, the RAM may be available in various forms such as SRAM, Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DRRAM).
Computer system 60 also includes removable/non-removable, volatile/nonvolatile computer storage media such as, for example, magnetic disk storage. Disk storage includes, but is not limited to, devices such as magnetic disk drives, floppy disk drives, tape drives, jaz drives, zip drives, LS-60 drives, flash memory cards, or memory sticks. In addition, the disk storage can include storage media separately or in combination with other storage media including, but not limited to, an optical disk drive such as a compact disk ROM device (CD-ROM), compact disk recordable drive (CD-R drive), compact disk rewritable drive (CD-RW drive) or a digital versatile disk ROM drive (DVD-ROM). To facilitate connection of the disk storage devices to the system bus, a removable or non-removable interface may be used.
In various aspects, the computer system 60 of fig. 6, the imaging module 38 and/or the visualization system 58 and/or the processor module 15 of fig. 4-6 may include an image processor, an image processing engine, a Graphics Processing Unit (GPU), a media processor, or any special-purpose Digital Signal Processor (DSP) for processing digital images. The image processor may employ parallel computation with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) techniques to increase speed and efficiency. The digital image processing engine may perform a series of tasks. The image processor may be a system on a chip having a multi-core processor architecture.
Fig. 7 shows an augmented reality system 263 that includes an intermediate signal combiner 64 positioned in the communication path between the imaging module 38 and the surgical hub display 67. The signal combiner 64 combines audio and/or image data received from the imaging module 38 and/or the AR device 66. The surgical hub 56 receives the combined data from the combiner 64 and superimposes the data provided to the display 67 on which the superimposed data is displayed. Imaging device 68 may be a digital video camera and audio device 69 may be a microphone. The signal combiner 64 may include a wireless heads-up display adapter to couple to an AR device 66 placed in the communication path of the display 67 to the console, allowing the surgical hub 56 to superimpose data on the display 67.
Fig. 8 illustrates an Augmented Reality (AR) system including an intermediate signal combiner positioned in a communication path between an imaging module and a surgical hub display. Fig. 8 shows AR device 66 worn by surgeon 73 to transmit data to surgical hub 56. Peripheral information of AR device 66 does not include active video. Instead, the peripheral information includes only the device settings or signals that do not have the same refresh rate requirements. The interaction may augment the surgeon's 73 information based on linking with preoperative Computed Tomography (CT) or other data linked in the surgical hub 56. The AR device 66 may identify structures, such as querying whether the instrument is contacting a nerve, vessel, or adhesion. The AR device 66 may include preoperative scan data, optical views, tissue interrogation features obtained throughout the procedure, and/or processing in the surgical hub 56 for providing answers. The surgeon 73 may dictate notes to the AR device 66 for storage in the hub storage device 45 with the patient data for later reporting or follow-up.
The AR device 66 worn by the surgeon 73 is linked to the surgical hub 56 with audio and visual information in order to avoid the need for superposition and to allow the displayed information to be customized around the periphery of the field of view. The AR device 66 provides signals from devices (e.g., instruments) and answers queries about device settings or positional information linked to the video in order to identify quadrants or positions. The AR device 66 has audio control and audio feedback. The AR device 66 is able to interact with all other systems in the operating room, and feedback and interaction are available wherever the surgeon 73 looks. For example, the AR device 66 may receive voice- or gesture-initiated commands and queries from the surgeon, and the AR device 66 may provide feedback in the form of one or more modalities including audio, visual, or haptic touch.
Fig. 9 shows a surgeon 73 wearing an AR device 66, a patient 74, and a camera 96 that may be included in an operating room 75. The AR device 66 worn by the surgeon 73 may be used to present virtual objects superimposed on a real-time image of the surgical field to the surgeon 73 via an augmented reality display 89 or via the hub-connected display 67. The real-time image may include a portion of the surgical instrument 77. The virtual objects may not be visible to other people within the operating room 75 (e.g., surgical assistants or nurses), even though they may also be wearing AR devices 66. Even if another person is viewing the operating room 75 with an AR device 66, that person may not be able to see the virtual object, or may see it in an augmented reality shared with the surgeon 73, or may see a modified version of the virtual object (e.g., according to a customization unique to the surgeon 73), or may see a different virtual object.
The virtual objects and/or data may be configured to appear on a portion of the surgical instrument 77 or in a surgical field captured by the imaging module 38, by the imaging device 68 during a minimally invasive surgical procedure, and/or by the camera 96 during an open surgical procedure. In the illustrated example, the imaging module 38 is a laparoscopic camera that provides a real-time feed of the surgical field during a minimally invasive surgical procedure. The AR system may present a virtual object that is fixed to a real object regardless of the perspective of one or more observers (e.g., surgeon 73) of the AR system. For example, the virtual object may be visible to an observer of the AR system inside the operating room 75 and not visible to an observer of the AR system outside the operating room 75. The virtual object may be displayed to an observer outside the operating room 75 when that observer enters the operating room 75. The augmented image may be displayed on the surgical hub display 67 or the augmented reality display 89.
The AR device 66 may include one or more screens or lenses, such as a single screen or two screens (e.g., one screen for each eye of the user). The screen may allow light to pass through the screen such that aspects of the real environment are visible when the virtual object is displayed. The virtual object may be made visible to the surgeon 73 by projected light. The virtual object may appear to have some degree of transparency or may be opaque (i.e., occlude aspects of the real environment).
The AR system may be viewable by one or more observers and may include differences among the views available to those observers while maintaining some aspects common between the views. For example, a heads-up display may change between two views, while virtual objects and/or data may be fixed to a real object or region in both views. Aspects of an object, such as its color or illumination, may be changed between views without changing the fixed location of the at least one virtual object.
The user may perceive virtual objects and/or data presented in the AR system as opaque or as having some degree of transparency. In one example, the user may interact with a virtual object, such as by moving the virtual object from a first location to a second location. For example, the user may move the object with his or her own hand. This may be accomplished virtually in the AR system by determining that the hand has moved into a position coincident with or adjacent to the object (e.g., using one or more cameras, which may be mounted on the AR device 66, such as AR device camera 79, or separate from it, such as camera 96, and which may be static or controllable to move) and causing the object to move in response. Virtual aspects may include virtual representations of real-world objects or may include visual effects, such as lighting effects and the like. The AR system may include rules that govern the behavior of virtual objects, such as subjecting a virtual object to gravity or friction, or may include other predefined rules that defy real-world physical constraints (e.g., floating objects, perpetual motion, etc.). The AR device 66 may include a camera 79 located on the AR device 66 (not to be confused with camera 96, which is separate from the AR device 66). The AR device camera 79 or camera 96 may include an infrared camera, an infrared filter, a visible-light filter, multiple cameras, a depth camera, and the like. The AR device 66 may project virtual items over a representation of the real environment so that they are viewable by the user.
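A toy sketch of the hand/virtual-object interaction described above, assuming a tracked hand position and an invented distance threshold, might look like this:

```python
import math

def is_adjacent(hand_xyz, object_xyz, threshold: float = 0.05) -> bool:
    """True when the tracked hand is within `threshold` meters of the virtual object."""
    return math.dist(hand_xyz, object_xyz) <= threshold

def update_virtual_object(hand_xyz, object_xyz, grabbing: bool):
    """If the hand is adjacent and in a 'grab' gesture, the virtual object follows the hand;
    otherwise it stays put. A toy stand-in for the camera-driven interaction described above."""
    if grabbing and is_adjacent(hand_xyz, object_xyz):
        return tuple(hand_xyz)   # the object snaps to (follows) the hand
    return tuple(object_xyz)

print(update_virtual_object((0.10, 0.20, 0.30), (0.12, 0.21, 0.30), grabbing=True))
```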
The AR device 66 may be used in an operating room 75 during a surgical procedure, for example, performed on a patient 74 by a surgeon 73. The AR device 66 may project or display virtual objects such as virtual objects during a surgical procedure to enhance the vision of the surgeon. The surgeon 73 may view the virtual object using the AR device 66, a remote control for the AR device 66, or may interact with the virtual object, such as using a hand to "interact" with the virtual object or a gesture recognized by a camera 79 of the AR device 66. The virtual object may augment a surgical tool, such as surgical instrument 77. For example, the virtual object may appear (to the surgeon 73 viewing the virtual object through the AR device 66) to be coupled to or maintained a fixed distance from the surgical instrument 77. In another example, the virtual object may be used to guide a surgical instrument 77 and may appear to be fixed to the patient 74. In some examples, the virtual object may react to movement of other virtual objects or real world objects in the surgical field of view. For example, the virtual object may be changed while the surgeon is manipulating a surgical instrument that is proximate to the virtual object.
The augmented reality display system imaging device 38 captures a real image of the surgical field during the surgical procedure. The augmented reality displays 89, 67 present a superposition of the operational aspects of the surgical instrument 77 over the real image of the surgical field. Surgical instrument 77 includes communication circuitry 231 to communicate operational aspects and functional data from surgical instrument 77 to AR device 66 via communication circuitry 233 on AR device 66. Although surgical instrument 77 and AR device 66 are shown communicating over an RF wireless link between circuits 231, 233, as indicated by arrows B and C, other communication techniques (e.g., wired, ultrasonic, infrared, etc.) may be employed. The overlay relates to the operational aspects of the surgical instrument 77 being actively visualized, and combines aspects of tissue interaction in the surgical field with functional data from the surgical instrument 77. The processor portion of AR device 66 is configured to receive the operational aspects and functional data from surgical instrument 77, determine overlays relating to the operation of surgical instrument 77, and combine tissue aspects in the surgical field with the functional data from surgical instrument 77. The enhanced images indicate warnings regarding device performance considerations, incompatible usage, and incomplete capture. Incompatible uses include tissue out-of-range conditions and tissue incorrectly balanced within the jaws of the end effector. Additional enhanced images provide indications of incidents, including indications of tissue tension and of foreign object detection. Other enhanced images indicate device status overlays and instrument indications.
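As an illustration only, turning instrument functional data into such overlay warnings could be sketched as below; the field names and limits are invented, since the disclosure only identifies the categories of alerts.

```python
def build_alerts(functional_data: dict) -> list:
    """Turn instrument functional data into overlay alert strings.
    Field names and limits are invented for illustration; the disclosure only says that
    alerts of these kinds (incompatible usage, incomplete capture, tissue tension,
    foreign objects) are indicated."""
    alerts = []
    thickness = functional_data.get("tissue_thickness_mm")
    if thickness is not None and not (1.0 <= thickness <= 3.0):
        alerts.append("incompatible usage: tissue thickness out of range")
    if functional_data.get("jaw_imbalance", 0.0) > 0.25:
        alerts.append("incompatible usage: tissue not balanced within the jaws")
    if functional_data.get("capture_complete") is False:
        alerts.append("incomplete capture of tissue")
    if functional_data.get("tissue_tension", 0.0) > 0.8:
        alerts.append("high tissue tension")
    if functional_data.get("foreign_object_detected"):
        alerts.append("foreign object detected")
    return alerts

print(build_alerts({"tissue_thickness_mm": 4.2, "capture_complete": False}))
```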
Fig. 10 illustrates a system 83 for enhancing a surgical field image with information using an AR display 89 in accordance with at least one aspect of the present disclosure. The system 83 may be used to perform the techniques described below, for example, by using the processor 85. The system 83 includes one aspect of the AR device 66 that may communicate with a database 93. The AR device 66 includes a processor 85, a memory 87, an AR display 89, and a camera 79. The AR device 66 may include a sensor 90, a speaker 91, and/or a haptic controller 92. Database 93 may include an image store 94 or a pre-operative plan store 95.
The processor 85 of the AR device 66 includes an augmented reality modeler 86. The augmented reality modeler 86 may be used by the processor 85 to create an augmented reality environment. For example, the augmented reality modeler 86 may receive an image of an instrument in a surgical field of view, such as from the camera 79 or the sensor 90, and create an augmented reality environment to fit within a displayed image of the surgical field of view. In another example, physical objects and/or data may be superimposed on the surgical field of view and/or the surgical instrument image, and the augmented reality modeler 86 may use the physical objects and data to present an augmented reality display of the virtual objects and/or data in the augmented reality environment. For example, the augmented reality modeler 86 may use or detect an instrument at a surgical site of a patient and present virtual objects and/or data on the surgical instrument and/or images of the surgical site in a surgical field captured by the camera 79. The AR display 89 may display an AR environment superimposed on a real environment. Display 89 may display virtual objects and/or data using AR device 66, such as a fixed location in an AR environment.
The AR device 66 may include a sensor 90, such as an infrared sensor. The camera 79 or sensor 90 may be used to detect movements, such as gestures by a surgeon or other user, which the processor 85 may interpret as user attempts or intended interactions with the virtual target. The processor 85 may identify objects in the real environment, such as by using the processing information received by the camera 79. In other aspects, the sensor 90 may be a tactile, auditory, chemical, or thermal sensor to generate corresponding signals that may be combined with various data feeds to create an enhanced environment. The sensors 90 may include binaural audio sensors (spatial sound), inertial measurement sensors (accelerometers, gyroscopes, magnetometers), environmental sensors, depth camera sensors, hand-eye tracking sensors, and voice command recognition functions.
For example, during a surgical procedure, the AR display 89 may present virtual features corresponding to physical features hidden by anatomical aspects of the patient, such as within the surgical field, while allowing the surgical field to be viewed through the AR display 89. The virtual feature may have a virtual position or orientation corresponding to a first physical position or orientation of the physical feature. In one example, the virtual position or orientation of the virtual feature may include an offset from a first physical position or orientation of the physical feature. The offset may include a predetermined distance from the augmented reality display, a relative distance from the augmented reality display to an anatomical aspect, and the like.
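A minimal sketch of placing a virtual feature at an offset from the hidden physical feature, with an invented offset value, is:

```python
def virtual_pose(physical_xyz, offset_xyz=(0.0, 0.0, 0.02)):
    """Place the virtual feature at the hidden physical feature's position plus a small
    offset (here an invented 2 cm standoff toward the viewer), as described above."""
    return tuple(p + o for p, o in zip(physical_xyz, offset_xyz))

# A vessel hidden 1 cm below the tissue surface is drawn slightly offset toward the display.
print(virtual_pose((0.30, 0.12, -0.01)))
```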
In one example, the AR device 66 may be a single AR device. In one aspect, the AR device 66 may be a HoloLens 2 AR device manufactured by Microsoft Corporation of Redmond, Wash. The AR device 66 includes goggles with lenses and binaural audio features (spatial sound), inertial measurement devices (accelerometer, gyroscope, magnetometer), environmental sensors, depth and video cameras, hand and eye tracking, and voice command recognition functions. It provides an improved field of view with high resolution by using mirrors to direct waveguides in front of the wearer's eyes. Images can be enlarged by changing the angles of the mirrors. It also provides eye tracking to identify the user and adjust the lens widths for that particular user.
In another example, the AR device 66 may be a Snapchat Spectacles 3 AR device. This AR device is able to capture paired images and recreate a 3D depth map, add virtual effects, and replay 3D video. The AR device includes two HD cameras to capture 3D photos and videos at 60 fps, while four built-in microphones record immersive, high-fidelity audio. Images from the two cameras combine to build a geometric map of the real world around the user, providing a new perception of depth. The photos and videos may be wirelessly synchronized to external display devices.
In yet another example, the AR device 66 may be a Google Glass 2 AR device. This AR device provides an overlay of inertial measurement (accelerometer, gyroscope, magnetometer) information on the lens (outside the field of view) to supplement information.
In another example, the AR device 66 may be Amazon's Echo Frames AR device. The AR device has no camera/display. The microphone and speaker are connected to Alexa. The AR device provides less functionality than a heads-up display.
In yet another example, AR device 66 may be a North (Google) Focals AR device. The AR device provides notification push/smart watch simulation; inertial measurement, screen overlay of information (weather, calendar, messages), voice control (Alexa) integration. The AR device provides a basic heads-up display function.
In another example, the AR device 66 may be an Nreal AR device. The AR device includes spatial sound, two ambient cameras, photo cameras, IMU (accelerometer, gyroscope), ambient light sensor, proximity sensor functions. Nebula projects application information onto the lens.
In various other examples, the AR device 66 may be any of the following commercially available AR devices: Magic Leap 1, Epson Moverio, Vuzix Blade AR, ZenFone AR, Microsoft AR eyeglass prototypes, or EyeTap, which generates light collinear with ambient light directly into the retina. For example, a beam splitter makes the same light seen by the eye available for computer processing and superimposing of information. AR visualization systems include HUDs, contact lenses, glasses, Virtual Reality (VR) headsets, virtual retinal displays, operating room displays, and/or smart contact lenses (biomimetic lenses).
The multi-user interface for AR device 66 includes a virtual retinal display (such as a raster display drawn directly on the retina rather than on a screen in front of the eye), a smart television, a smart phone, and/or a spatial display (such as a Sony spatial display system).
Other AR technologies may include, for example, AR capture devices and software applications, AR creation devices and software applications, and AR cloud devices and software applications. AR capture devices and software applications include, for example, the Apple Polycam application and Ubiquity 6 (Mirrorworld using the Display.land app), with which the user can scan and obtain 3D images of the real world (to create a 3D model). AR creation devices and software applications include, for example, Adobe Aero, Vuforia, ARToolKit, Google ARCore, Apple ARKit, MAXST, Aurasma, Zappar, and Blippar. AR cloud devices and software applications include, for example, Facebook, Google (world geometry, object recognition, predictive data), Amazon AR Cloud (commerce), Microsoft Azure, Samsung Project Whare, Niantic, and Magic Leap.
Situational awareness refers to the ability of some aspects of a surgical system to determine or infer information related to a surgical procedure from data received from databases and/or instruments. The information may include the type of surgery being performed, the type of tissue being operated on, or the body cavity being the subject of the surgery. With context information associated with the surgical procedure, the surgical system may, for example, improve the manner in which the surgical system controls the modular devices (e.g., robotic arms and/or robotic surgical tools) connected thereto, and provide context information or advice to the surgeon during the course of the surgical procedure.
Fig. 11 shows a timeline of a situational awareness surgical procedure. Fig. 11 shows a timeline 5200 of an exemplary surgical procedure and the context information that the surgical hub 5104 can derive from data received from the data sources 5126 at each step of the surgical procedure. The timeline 5200 depicts typical steps that nurses, surgeons and other medical personnel will take during a segmental lung resection procedure, starting with the setup of the operating room and ending with the transfer of the patient to a post-operative recovery room. The situationally aware surgical hub 5104 receives data from the data sources 5126 throughout the surgical procedure, including data generated each time a medical professional utilizes a modular device 5102 paired with the surgical hub 5104. The surgical hub 5104 can receive this data from the paired modular devices 5102 and other data sources 5126 and continually derive inferences about the ongoing procedure (i.e., context information), such as which step of the procedure is being performed at any given time, as new data is received. The situational awareness system of the surgical hub 5104 can, for example, record data related to the procedure for generating reports, verify the steps that medical personnel are taking, provide data or cues (e.g., via a display screen) that may be relevant to a particular procedure step, adjust the modular devices 5102 based on context (e.g., activate a monitor, adjust the FOV of a medical imaging device, or change the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument), and take any other such action described herein.
First 5202, a hospital staff member retrieves the patient's EMR from the hospital's EMR database. Based on patient data selected in the EMR, the surgical hub 5104 determines that the procedure to be performed is a thoracic procedure.
Second 5204, the staff member scans the incoming medical supplies for the procedure. The surgical hub 5104 cross-compares the scanned supplies with the list of supplies utilized in the various types of protocols and confirms that the combination of supplies corresponds to the chest protocol. In addition, the surgical hub 5104 can also determine that the procedure is not a wedge procedure (because the incoming supplies lack certain supplies required for, or otherwise do not correspond to, a chest wedge procedure).
Third 5206, the medical personnel scans the patient belt via a scanner 5128 communicatively connected to the surgical hub 5104. The surgical hub 5104 may then confirm the identity of the patient based on the scanned data.
Fourth 5208, the medical staff opens the auxiliary equipment. The auxiliary devices utilized may vary depending on the type of surgery and the technique to be used by the surgeon, but in this exemplary case they include smoke evacuators, insufflators and medical imaging devices. When activated, the ancillary equipment as the modular device 5102 may automatically pair with the surgical hub 5104 located in a specific vicinity of the modular device 5102 as part of its initialization process. The surgical hub 5104 may then derive background information about the surgical procedure by detecting the type of modular device 5102 paired therewith during this pre-operative or initialization phase. In this particular example, the surgical hub 5104 determines that the surgical procedure is a VATS procedure based on this particular combination of paired modular devices 5102. Based on a combination of data from the patient's EMR, a list of medical supplies to be used in the procedure, and the type of modular device 5102 connected to the hub, the surgical hub 5104 can generally infer the particular procedure that the surgical team will perform. Once the surgical hub 5104 knows the particular procedure being performed, the surgical hub 5104 can retrieve the procedure from memory or the cloud and then cross-reference the data it subsequently receives from the connected data sources 5126 (e.g., the modular device 5102 and the patient monitoring device 5124) to infer the procedure being performed by the surgical team.
Fifth 5210, the staff attaches EKG electrodes and other patient monitoring devices 5124 to the patient. The EKG electrode and other patient monitoring device 5124 can be paired with the surgical hub 5104. As the surgical hub 5104 begins to receive data from the patient monitoring device 5124, the surgical hub 5104 thus confirms that the patient is in the operating room.
Sixth 5212, medical personnel induce anesthesia in patients. The surgical hub 5104 may infer that the patient is under anesthesia based on data (including EKG data, blood pressure data, ventilator data, or a combination thereof) from the modular device 5102 and/or the patient monitoring device 5124. At the completion of the sixth step 5212, the preoperative portion of the lung segmental resection procedure is completed and the operative portion begins.
Seventh 5214, the lung of the patient being operated on is collapsed (while ventilation is switched to the contralateral lung). The surgical hub 5104 may infer from the ventilator data that the patient's lung has been collapsed. The surgical hub 5104 can infer that the operative portion of the procedure has begun, as it can compare the detection of the patient's lung collapsing to the expected steps of the procedure (which can be previously accessed or retrieved), thereby determining that collapsing the lung is a surgical step in this particular procedure.
Eighth 5216, a medical imaging device 5108 (e.g., an endoscope) is inserted and video from the medical imaging device is activated. The surgical hub 5104 receives medical imaging device data (i.e., still image data or live streaming video in real time) through its connection with the medical imaging device. After receiving the medical imaging device data, the surgical hub 5104 may determine that the laparoscopic portion of the surgical procedure has begun. In addition, the surgical hub 5104 may determine that the particular procedure being performed is a segmental resection, rather than a pneumonectomy (note that the surgical hub 5104 has excluded wedge-shaped procedures based on the data received at the second step 5204 of the procedure). The data from the medical imaging device 124 (fig. 2) may be used to determine background information related to the type of procedure being performed in a number of different ways, including by determining the angle of the visual orientation of the medical imaging device relative to the patient's anatomy, monitoring the number of medical imaging devices utilized (i.e., activated and paired with the surgical hub 5104), and monitoring the type of visualization device utilized.
For example, one technique for performing a VATS lobectomy places the camera in the lower anterior corner of the patient's chest cavity above the diaphragm, while one technique for performing a VATS segmental resection places the camera in an anterior intercostal position relative to the segmental fissure. Using pattern recognition or machine learning techniques, for example, the situational awareness system may be trained to recognize the positioning of the medical imaging device from the visualization of the patient's anatomy. As another example, one technique for performing a VATS lobectomy utilizes a single medical imaging device, while another technique for performing a VATS segmental resection utilizes multiple cameras. As yet another example, one technique for performing a VATS segmental resection utilizes an infrared light source (which may be communicatively coupled to the surgical hub as part of the visualization system) to visualize the segmental fissure, which is not used in a VATS lobectomy. By tracking any or all of this data from the medical imaging device 5108, the surgical hub 5104 can thus determine the particular type of surgical procedure being performed and/or the technique being used for the particular type of surgical procedure.
Ninth 5218, the surgical team begins the anatomic steps of the procedure. The surgical hub 5104 can infer that the surgeon is in the process of dissecting to mobilize the patient's lungs because it receives data from the RF generator or ultrasound generator indicating that the energy instrument is being fired. The surgical hub 5104 can cross-reference the received data with a retrieval step of the surgical procedure to determine that the energy instrument fired at that point in the method (i.e., after completion of the previously discussed surgical step) corresponds to an anatomical step.
Tenth 5220, the surgical team proceeds to the ligation step of the procedure. The surgical hub 5104 can infer that the surgeon is ligating arteries and veins because it receives data from the surgical stapling and severing instrument indicating that the instrument is being fired. Similar to the previous steps, the surgical hub 5104 can derive the inference by cross-referencing the receipt of data from the surgical stapling and severing instrument with the retrieval steps in the method.
Eleventh 5222, the segmental resection portion of the procedure is performed. The surgical hub 5104 infers that the surgeon is transecting soft tissue based on data from the surgical instrument, including data from the staple cartridge. The cartridge data may correspond to the size or type of staples fired by the instrument. Because different types of staples are employed for different types of tissue, the cartridge data may indicate the type of tissue being stapled and/or transected. The fact that the type of staple being fired is intended for soft tissue, rather than other tissue types, enables the surgical hub 5104 to infer that the segmental resection portion of the procedure is being performed.
Twelfth 5224, the node dissection step is performed. The surgical hub 5104 may infer that the surgical team is dissecting a node and performing a leak test based on data received from the generator indicating that an RF or ultrasonic instrument is being fired. For this particular procedure, the use of an RF or ultrasonic instrument after transecting the soft tissue corresponds to a node dissection step, which allows the surgical hub 5104 to make this inference. It should be noted that the surgeon switches back and forth between surgical stapling/cutting instruments and surgical energy (i.e., RF or ultrasonic) instruments periodically, depending on the particular step in the procedure, as the different instruments are better suited for the particular task. Thus, the particular sequence in which the stapling/severing instrument and the surgical energy instrument are used may dictate the steps of the procedure that the surgeon is performing. At the completion of the twelfth step 5224, the incision is closed and the post-operative portion of the procedure begins.
Thirteenth 5226, the patient is reversed from anesthesia. For example, the surgical hub 5104 may infer that the patient is waking from anesthesia based on ventilator data (i.e., the patient's respiration rate begins to increase).
Finally, fourteenth 5228, the medical personnel remove various patient monitoring devices 5124 from the patient. Thus, when the surgical hub 5104 loses EKG, BP and other data from the patient monitoring device 5124, the hub can infer that the patient is being transferred to the recovery room. The surgical hub 5104 can determine or infer when each step of a given surgical procedure occurs from data received from various data sources 5126 communicatively coupled to the surgical hub 5104.
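The step-by-step inferences described above amount to cross-referencing incoming data-source events against the retrieved steps of the procedure. The sketch below is a deliberately simplified, hypothetical illustration of that idea; the event names, rule table, and infer_step helper are assumptions and do not represent the actual implementation of the surgical hub 5104.

```python
# Hypothetical sketch of rule-based step inference for a situationally aware hub;
# event names and rules are illustrative only.
from dataclasses import dataclass

@dataclass
class Event:
    source: str      # e.g., "ventilator", "rf_generator", "stapler"
    signal: str      # e.g., "lung_collapsed", "energized", "fired"

# Ordered rules: (expected source, expected signal, inferred procedure step)
RULES = [
    ("emr",          "thoracic_patient_data",  "procedure_planned"),
    ("ventilator",   "lung_collapsed",         "operative_portion_started"),
    ("scope",        "video_stream_active",    "scope_inserted"),
    ("rf_generator", "energized",              "dissection"),
    ("stapler",      "fired",                  "ligation_or_transection"),
    ("ventilator",   "respiration_increasing", "anesthesia_reversal"),
]

def infer_step(event: Event, completed_steps: list[str]) -> str | None:
    """Return the next inferred step, cross-referencing already completed steps."""
    for source, signal, step in RULES:
        if event.source == source and event.signal == signal and step not in completed_steps:
            return step
    return None

completed: list[str] = []
for e in [Event("ventilator", "lung_collapsed"), Event("rf_generator", "energized")]:
    step = infer_step(e, completed)
    if step:
        completed.append(step)
print(completed)  # ['operative_portion_started', 'dissection']
```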
In addition to using patient data from the EMR database to infer the type of surgical procedure to be performed, the situational awareness surgical hub 5104 may also use patient data to generate control adjustments for the paired modular device 5102, as shown in a first step 5202 of the timeline 5200 shown in fig. 11.
Visualizing a mixture directly with rendering elements to display the mixture elements and actions occurring on-screen and off-screen
Having described the general implementation of the various surgical systems, surgical hubs, communication systems, augmentation systems, and augmented reality devices disclosed herein, such as surgical systems 1, 2, 50, 52, surgical hubs 6, 56, 5104, communication system 63, visualization system 8, augmentation system 83, imaging devices 24, 96, and AR devices 66, 84, the present disclosure now turns to describing various other implementations of systems, hubs, and devices. For the sake of brevity, various details and implementations of systems, hubs, and devices described in the following sections that are similar to the various systems, hubs, and devices described above are not repeated herein. Any aspects of the systems, hubs, and devices described below may be incorporated into and/or implemented by the above systems, hubs, and devices.
As described above, augmented reality display devices and other types of display devices may be used to provide a superposition of information to an Operating Room (OR) staff during a surgical procedure. In some aspects, these overlays can include information related to the procedure of the procedure being performed by the OR staff. Thus, the information displayed by the overlay may need to be based on the surgical instrument being used by the staff member or the area of the surgical field in which the staff member is working. However, in surgery, there are often multiple OR staff interacting with a wide variety of surgical instruments and other objects. In addition, surgeons, nurses, and assistants may all work in and around the surgical field at different times. Thus, each worker can move and manipulate multiple surgical instruments throughout the operating room, transferring the instruments to each other and placing the instruments aside when not in use. Given the constantly changing conditions in an OR, it may be difficult for a surgical system to track and organize the information that needs to be displayed to various staff on various devices throughout the surgical procedure. Accordingly, there is a need for apparatus, systems, and methods for tracking multiple users and objects within an OR so that related information may be properly displayed by various augmented reality and other display devices.
Further, at different times during the surgical procedure, staff, instruments, and other objects may enter and leave the field of view of the various imaging devices of the surgical system, such as imaging devices configured to be able to capture images of the surgical field of view. As a result, a worker relying on augmented reality and other display devices that are displaying images of the captured surgical field may not be able to view a portion of the instrument being actively used. Thus, the staff may not be able to accurately perceive the important attributes of the instrument. For example, a surgeon performing a transection using an endocutter may not be able to view a portion of the endocutter as it passes out of view of the endoscope. Because of the obstructed view, the surgeon may not be able to perceive the range of articulation of the end effector of the endocutter. Alternatively, in another example, the surgeon may not be able to perceive the position of the end effector. Thus, it may be difficult for the surgeon to accurately perform the transection. Accordingly, there is a need for apparatus, systems, and methods for tracking attributes of surgical instruments that are outside of the field of view of an imaging device and displaying the tracked attributes using overlays on an augmented reality device and other display devices.
Furthermore, at different times during the surgical procedure, surgical instruments and other objects may be out of view of the various imaging devices. Thus, staff relying on augmented reality and other display devices may not be able to perceive potential interactions of the surgical instrument with other objects outside the field of view. For example, a surgeon may use an energy device for the surgical procedure. However, viewing a display device that displays a live image of the surgical field captured by the endoscope, the surgeon may not be able to perceive that the energy device is very close to a metallic instrument outside the field of view of the endoscope. As a result, there is a risk that the surgeon may activate the energy device in close proximity to the metallic instrument, resulting in a malfunction (e.g., an arc) of the energy device. As another example, a surgeon attempting to perform a procedure using a circular stapler may be able to view the device platform of the stapler within the field of view of the imaging device, but not the anvil thereof outside of the field of view. Thus, it may be difficult for a surgeon to route and manipulate tissue to optimize the attachment of the device platform and anvil. As yet another example, a surgeon operating the device may not be able to perceive potential collisions or unintended interactions between the surgical instrument and objects outside of the field of view of the imaging device. Accordingly, there is a need for apparatus, systems, and methods for predicting interactions of a surgical instrument with objects outside of the field of view of an imaging device and displaying properties of the surgical instrument related to the potential interactions.
Real-time position tracking of objects
In various aspects, disclosed herein are devices, systems, and methods for tracking multiple users and objects within an OR. In some aspects, these devices, systems, and methods for tracking multiple users and objects may be employed to ensure that relevant information related to the tracked objects may be displayed to a particular user using the various augmented reality and other display devices disclosed herein.
Fig. 12 illustrates an example of a surgical system 15000 for tracking the position of objects within an OR in accordance with several non-limiting aspects of the present disclosure. The surgical system 15000 may include a surgical hub 15002 in communication with the tracking system 15006 and the at least one AR device 66. In some aspects, the tracking system 15006 may include a visualization system 15008. The surgical system 15000, surgical hub 15002, and visualization system 15008 may be similar in many respects to any of the surgical systems, surgical hubs, and visualization systems described above (e.g., surgical systems 1, 2, 50, 52; surgical hubs 6, 56, 5104; visualization systems 8, 58), respectively. In the non-limiting aspect of fig. 12, the tracking system 15006 communicates with a surgical hub 15002. In other aspects, the surgical hub can include a module that includes the tracking system 15006. The surgical hub 15002 may include an operating room mapping module 15004 configured to map the position and/OR status of various objects within the OR based on data received from the tracking system 15006, as discussed in more detail below.
The tracking system 15006 may be configured to track the location and/OR other attributes of various objects within the OR based on one OR more different types of tracking methods. In one aspect, the tracking system 15006 (and/or the visualization system 15008) may include one or more imaging devices 15010. The imaging device 15010 may be similar in many respects to the imaging devices 24, 96, the AR device 66, and/or other imaging sensors described above with respect to the visualization system 8. Thus, the imaging device 15010 may include a camera as well as other types of visual and non-visual sensors for capturing images OR otherwise tracking objects within the OR. For example, the imaging device 15010 may employ visual, infrared, and/OR other higher wavelength image recognition techniques to establish the positional movement and location of objects within the OR. The imaging device 15010 may be placed in multiple locations where the fields of view of the entire operating room overlap so that images of objects within the OR may be captured and tracked from multiple angles. Further, multiple imaging devices 15010 may be implemented such that when an object leaves the field of view of a first imaging device 15010 (e.g., a first camera 15010), the object in the OR may be tracked by at least a second imaging device 15010 (e.g., a second camera 15010).
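One way to realize the camera-to-camera handoff described above is to test each imaging device for visibility of the tracked object and select any device that still sees it. The following sketch is illustrative only; the Camera class, rectangular field-of-view test, and coordinates are assumptions rather than the disclosed tracking system 15006.

```python
# Illustrative multi-camera handoff: track an object with whichever
# imaging device currently has it in its field of view (hypothetical API).
from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    fov_min: tuple[float, float]  # (x, y) lower bound of the field of view
    fov_max: tuple[float, float]  # (x, y) upper bound of the field of view

    def sees(self, position: tuple[float, float]) -> bool:
        return all(lo <= p <= hi for p, lo, hi in zip(position, self.fov_min, self.fov_max))

def select_tracking_camera(cameras: list[Camera], obj_position: tuple[float, float]) -> Camera | None:
    """Return the first camera whose field of view contains the object, if any."""
    for cam in cameras:
        if cam.sees(obj_position):
            return cam
    return None

cameras = [Camera("cam_A", (0, 0), (5, 5)), Camera("cam_B", (4, 0), (10, 5))]
print(select_tracking_camera(cameras, (6.0, 2.0)).name)  # cam_B picks up where cam_A loses view
```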
In another aspect, the tracking system 15006 may include one OR more structured light sensors 15012 (e.g., structured light scanners) configured to be able to track objects in the OR. The structured light sensor 15012 may be configured to be capable of projecting a defined light pattern, for example, from multiple angles, in order to triangulate the position of an object within the OR based on the distortion of the pattern caused by the object. The structured light sensor 15012 may be similar to and/or include devices such as Microsoft's Kinect, Intel's F200, Intel's R200, and/or the Occipital Structure Sensor.
In another aspect, the tracking system 15006 may include one OR more LIDAR (light detection and ranging) sensors 15014 configured to be able to track objects in the OR. In other aspects, the tracking system 15006 may employ sensors that use LIDAR-like technology to track objects in the OR.
In another aspect, the tracking system 15006 may include one OR more floor sensors 15016 configured to track objects in the OR. The floor sensor 15016 may comprise a weight sensor. In one aspect, the tracking system may include an array of floor sensors 15016 configured to be able to determine where to place the device within the OR. For example, referring now to fig. 2 and 12, the floor sensor 15016 may be configured to determine the location and/or position of the operating table 14, the surgeon's console 18, the robotic hub 22, the side cart 20, etc. In another aspect, the floor sensor 15016 may be configured to monitor the location and/OR position of personnel within the OR.
The data generated by the floor sensors 15016 may be processed by the surgical hub 15002 and a determination may be made regarding various aspects of the surgical procedure. For example, the weight measured by the floor sensors 15016 may be used to determine whether the device has been placed on a side cart OR other component of the OR apparatus (e.g., by identifying a change in weight when the device is placed). As another example, the floor sensors 15016 may be used to detect fatigue of OR workers based on movement (e.g., sway, weight distribution, etc.) of the OR workers. As another example, a floor sensor may be used to track the weight of a patient during surgery. Throughout the procedure, patient weight may be verified by the surgical hub 15002 for various reasons, such as to ensure that the dosage of medication administered to the patient is within an acceptable range of the patient's weight, as tracked by the tracking system 15006. As yet another example, the floor sensors 15016 may be used to track medical waste, devices, equipment, etc. that fall to the floor during surgery.
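As a concrete illustration of the weight-based dosage check mentioned above, the sketch below compares an ordered dose against a dosing range scaled by the patient weight tracked through the floor sensors; the dosing limits, dose values, and function name are hypothetical.

```python
# Hypothetical dosage check against patient weight tracked by floor sensors.
def dose_within_range(ordered_dose_mg: float,
                      patient_weight_kg: float,
                      min_mg_per_kg: float,
                      max_mg_per_kg: float) -> bool:
    """Return True if the ordered dose falls inside the weight-based range."""
    low = min_mg_per_kg * patient_weight_kg
    high = max_mg_per_kg * patient_weight_kg
    return low <= ordered_dose_mg <= high

tracked_weight_kg = 82.5   # e.g., derived from floor sensor data
ordered_dose_mg = 900.0    # illustrative order
if not dose_within_range(ordered_dose_mg, tracked_weight_kg, 5.0, 10.0):
    print("Alert: ordered dose outside weight-based range")  # would surface as an AR notification
```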
Referring still to fig. 12, in another aspect, the tracking system 15006 may include one OR more acoustic sensors 15018 configured to be capable of monitoring the position, location, and/OR movement of objects in the OR. For example, acoustic sensor 15018 can employ audio beacon technology, phase coherent tracking, and/OR time-of-flight triangulation to establish the positional movement of objects in the OR.
In another aspect, the tracking system 15006 may include one or more fiducial markers 15020. In one aspect, the fiducial markers 15020 may be any type of markers configured to facilitate the location, position, and/or movement of the tracked object relative to the field of view of the imaging device 15010 and/or relative to the location, position, and/or movement data tracked by any of the other devices/sensors of the tracking system 15006. For example, the fiducial marker 15020 may include an RFID (radio frequency identification) chip configured to be able to track the location and/or position of an object to which the RFID chip is attached. Thus, in some aspects, fiducial markers 15020 may be placed in and/OR on surgical equipment, operating room equipment, objects worn by OR staff, OR any other object that may be tracked by tracking system 15006. In some aspects, the initiation of tracking of the fiducial mark 15020 by the tracking system 15006 may be triggered based on the occurrence of an event, such as the removal of an object (e.g., device) including the fiducial mark 15020 from its packaging, the insertion of a battery into an object (e.g., device) including the fiducial mark 15020, and/OR when an object including the fiducial mark 15020 enters an OR. The fiducial markers 15020 may be used to help generate an augmented reality overlay, as discussed in more detail below.
In another aspect, the tracking system 15006 may include one OR more user/device sensors 15022 configured to identify and monitor the position, location, and/OR movement of OR personnel and/OR devices within an OR. In one aspect, the user/device sensor 15022 may be included in a device OR apparatus worn by an OR staff member. The user/device sensors 15022 may include, for example, accelerometers, gyroscopes, and/OR magnetometers to track the three-dimensional movement of the OR staff and/OR the device. In other aspects, the user/device sensor 15022 may comprise an RFID wristband worn by an OR staff member. In one aspect, data from the user/device sensors 15022 may be used by the surgical hub 15002 and/OR tracking system 15006 to associate a device (e.g., a surgical instrument) to a particular user within an OR at a given time during a surgical procedure. For example, the tracking system 15006 and/OR surgical hub 15002 may be configured to track the movement of an OR staff member and device, respectively, using a plurality of user/device sensors 15022. When the tracking system 15006 detects that a user/device sensor 15022 worn by an OR operator is proximate to a user/device sensor 15022 associated with a surgical instrument, the surgical hub 15002 may identify that the OR operator is associated with (e.g., is using) the surgical instrument. Based on the identified association of the staff member and the instrument, the surgical hub 15002 may cause an augmented reality overlay to be generated for the staff member and/or the surgical instrument, as will be described in more detail below with respect to fig. 12-14. In another aspect, the user/device sensor 15022 may be used to identify instruments and/or devices. The user/device sensor 15022 may include an RFID tag to identify the particular type of device being used during surgery. As one example, the user/device sensor 15022 may be an RFID tag on the trocar to identify the type of trocar being used.
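The proximity-based association of a surgical instrument with a staff member described above can be sketched as a nearest-sensor search over tracked sensor positions; the sensor IDs, coordinates, and distance threshold below are illustrative assumptions.

```python
# Hypothetical proximity-based association of a device sensor with a user sensor.
import math

def nearest_user(device_pos: tuple[float, float, float],
                 user_positions: dict[str, tuple[float, float, float]],
                 max_distance_m: float = 0.5) -> str | None:
    """Return the ID of the closest user sensor within max_distance_m, else None."""
    best_id, best_dist = None, float("inf")
    for user_id, pos in user_positions.items():
        dist = math.dist(device_pos, pos)
        if dist < best_dist:
            best_id, best_dist = user_id, dist
    return best_id if best_dist <= max_distance_m else None

users = {"glove_surgeon_R": (1.2, 0.8, 1.1), "glove_nurse_L": (2.5, 1.9, 1.0)}
print(nearest_user((1.25, 0.85, 1.05), users))  # glove_surgeon_R -> the surgeon is using the device
```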
In another aspect, the tracking system 15006 may include one OR more GPS (global positioning system) sensors 15024 that use GPS (e.g., satellite) tracking technology to monitor the position, location, and/OR movement of objects in the OR. It should be noted that while the tracking system 15006 of fig. 12 is shown as implementing tracking techniques including the imaging device 15010, the structured light sensor 15012, the LIDAR sensor 15014, the floor sensor 15016, the acoustic sensor 15018, the fiducial markers 15020, the user/device sensor 15022, and the GPS sensors 15024, the tracking system 15006 may be configured to be able to use any combination of these techniques (e.g., including only some of these techniques). Further, in some aspects, other tracking techniques configured to be able to track the location, position, movement, and/OR other attributes of objects in the OR may be implemented by tracking system 15006.
Fig. 13 shows a schematic side view of an exemplary embodiment of a tracking system 15006 in an Operating Room (OR) 15050 in accordance with several non-limiting aspects of the present disclosure. OR 15050 may be any part of an OR. For example, in some aspects, fig. 13 may show an overall side view of OR 15050. In other aspects, fig. 13 may show a cross-sectional side view of a surgical field. Tracking system 15006 may include a first tracking device 15054 and a second tracking device 15056. The first tracking device 15054 and/or the second tracking device 15056 may be implemented using any combination of the tracking techniques mentioned above with respect to fig. 12 (e.g., imaging device 15010, structured light sensor 15012, LIDAR sensor 15014, acoustic sensor 15018, etc.). The first tracking device 15054 and the second tracking device 15056 may implement the same tracking technique or different tracking techniques.
Referring primarily to fig. 13, and also to fig. 12, in some aspects, the first tracking device 15054 and/or the second tracking device 15056 may be configured to be capable of tracking the first portion 15058 and the second portion 15060 of the target object 15062. The target object 15062 may be an object OR region, such as an object within a surgical field, a region within a sterile barrier, a patient, tissue, OR equipment, a surgical device, OR staff, etc. The first tracking device 15054 is capable of directly tracking 15055A, 15055B the first portion 15058 and the second portion 15060 of the target object 15062. For example, the first tracking device 15054 may be a camera (e.g., imaging device 15010) in which the first portion 15058 and the second portion 15060 of the target object 15062 are tracked 15055A, 15055B directly in the field of view of the camera. The second tracking device 15056 is capable of directly tracking 15057A the first portion 15058 of the target object 15062. However, the second tracking device 15056 may not be able to track 15057B the second portion 15060. For example, the second tracking device may be a camera (e.g., imaging device 15010), and the second portion 15060 may be outside the field of view of the camera. This may be because the occluding object 15064 (e.g., OR staff, surgical instrument, tissue, etc.) is blocking tracking 15057B of the second portion 15060 of the target object 15062. Although the occluding object 15064 blocks the tracking 15057B of the second portion 15060 of the target object 15062 by the second tracking device 15056, the tracking system 15006 may still track the second portion 15060 because the first tracking device 15054 directly tracks 15055B the second portion 15060. Thus, the tracking system 15006 may be configured to have tracking devices that track overlapping tracking areas (e.g., multiple imaging devices/systems with overlapping fields of view) such that a target object may be tracked when the target object or a portion of the target object is outside of the field of view of one of the tracking devices. Thus, the surgical hub 15002 can cause the display device to display information (e.g., images, overlays, notifications, etc.) related to the target object even when the target object or a portion thereof is outside the field of view of one of the tracking devices.
Still referring primarily to fig. 13, and also to fig. 12, in other aspects, the first tracking device 15054 and/or the second tracking device 15056 may comprise tracking devices configured to be capable of directly tracking 15055A, 15055B, 15057A and/or indirectly tracking 15057C, 15057D the first portion 15058 and the second portion 15060 of the target object 15062. For example, the second tracking device 15056 may be a device (e.g., acoustic sensor 15018) that implements a reflection tracking technique such that when the occluding object 15064 inhibits direct tracking 15057B, the second tracking device 15056 may indirectly track 15057C, 15057D the target object 15062 (e.g., based on a reflection off of the object 15066). Thus, the tracking system 15006 may be configured to have tracking devices with overlapping tracking areas (e.g., including areas tracked using non-image technology, reflection technology, audio beacons, GPS, RFID, etc.) to track the location, position, movement, and/or other attributes of a target object.
Fig. 14 illustrates a schematic plan view of an exemplary operating room map 15070 generated by the operating room mapping module 15004 of fig. 12 in accordance with at least one non-limiting aspect of the present disclosure. Referring primarily to fig. 14, and also to fig. 12-15, the tracking system 15006 may send tracking data generated using any combination of tracking techniques (e.g., imaging device 15010, structured light sensor 15012, LIDAR sensor 15014, floor sensor 15016, acoustic sensor 15018, fiducial markers 15020, user/device sensors 15022, and GPS 15024) to the operating room mapping module 15004 of the surgical hub 15002. Based on the tracking data, the operating room mapping module 15004 may generate an operating room map 15070 of the operating room 15050. Map 15070 may include information related to the positioning, location, movement, and/or other attributes of a plurality of objects (e.g., 15072A-15072L) within the operating room 15050. For example, the objects 15072A-15072L of the map 15070 may correspond to various devices, equipment, OR staff present in an operating room. The map 15070 may be updated in real-time based on tracking data from the tracking system 15006. In some aspects, the map 15070 may be displayed by any of the display devices disclosed herein. In other aspects, the surgical hub 15002 may determine the proximity and/or interaction of objects based on the map 15070.
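A minimal sketch of a map like operating room map 15070 might keep a record per tracked object and refresh it as tracking data arrives, exposing a proximity query for interaction checks; the class and field names below are assumptions made for illustration.

```python
# Illustrative operating-room map structure updated from tracking data.
from dataclasses import dataclass, field
import time

@dataclass
class TrackedObject:
    object_id: str                 # e.g., "15072A"
    kind: str                      # "device", "equipment", or "staff"
    position: tuple[float, float]
    last_seen: float = field(default_factory=time.time)

class OperatingRoomMap:
    def __init__(self) -> None:
        self.objects: dict[str, TrackedObject] = {}

    def update(self, object_id: str, kind: str, position: tuple[float, float]) -> None:
        self.objects[object_id] = TrackedObject(object_id, kind, position)

    def proximity(self, id_a: str, id_b: str) -> float:
        ax, ay = self.objects[id_a].position
        bx, by = self.objects[id_b].position
        return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

or_map = OperatingRoomMap()
or_map.update("15072A", "staff", (2.0, 3.5))
or_map.update("15072B", "device", (2.2, 3.4))
print(round(or_map.proximity("15072A", "15072B"), 2))  # distance used for interaction checks
```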
Multi-user tracking and information correlation
Referring to fig. 12-14, in other aspects, augmented reality overlays and other notifications and warnings can be generated based on the location, position, and/or movement of objects (e.g., 15072A-15072L) as determined by the operating room mapping module 15004. In some aspects, the tracking system 15006 may be used to track users and devices based on the user/device sensors 15022 and other sensing technologies (e.g., imaging device 15010, structured light sensor 15012, LIDAR sensor 15014, floor sensor 15016, acoustic sensor 15018, GPS 15024, etc.). In some aspects, as described above, the surgical hub 15002 (e.g., the operating room mapping module 15004) and/OR tracking system 15006 may use the user and device sensor data to associate a device (e.g., a surgical instrument) to a particular user within the OR at a given time during a surgical procedure. In this regard, the tracking system 15006 and operating room mapping module 15004 may use both active (wearable) and passive (camera) tracking methods to generate a map 15070 of the locations of devices and staff within the OR suite. For example, users (the surgeon and/or other OR staff) may wear gloves that include user/device sensors 15022. In one aspect, the surgical hub 15002 may be configured to recognize the gloves as being coupled to the right and left hands of the user.
In some aspects, the operating room mapping module 15004 may be configured to be able to associate a user with a particular location within the operating room map 15070. For example, the operating room map 15070 may divide the OR 15050 into specific regions. Based on data from the tracking system 15006, the operating room mapping module 15004 may identify users with prioritization and/or control of devices. When a user exchanges devices (e.g., transfers physical control of the device to a different user), a sensor (e.g., device/user sensor 15022) may detect the exchange, thereby enabling the surgical hub 15002 to identify which user is associated with the device. For example, a glove worn by a user may be configured with sensors that track finger pressure and position. The surgical hub 15002 (e.g., the operating room mapping module 15004) may determine who has control over the device by using glove sensor data, device sensor data, and/or data from the imaging device 15010 to calculate the likelihood of who is guiding the device.
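The likelihood calculation of who is guiding a device, as described above, could for example combine glove pressure readings with camera-based confidence in a weighted score; the weights and normalized inputs below are purely assumed values.

```python
# Hypothetical scoring of which user most likely controls a device,
# combining glove finger-pressure data with camera-based confidence.
def control_likelihood(grip_pressure: float, camera_confidence: float,
                       pressure_weight: float = 0.6, camera_weight: float = 0.4) -> float:
    """Weighted score in [0, 1]; inputs are assumed to be normalized to [0, 1]."""
    return pressure_weight * grip_pressure + camera_weight * camera_confidence

candidates = {
    "surgeon":   control_likelihood(grip_pressure=0.9, camera_confidence=0.8),
    "assistant": control_likelihood(grip_pressure=0.2, camera_confidence=0.4),
}
controller = max(candidates, key=candidates.get)
print(controller)  # "surgeon" -> the hub associates the device with the surgeon
```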
In various aspects, the surgical hub 15002 may cause any of the displays and/or AR devices described herein (displays 7, 9, 19; AR devices 66, 84) to display notifications, warnings, and/or overlays based on data from the tracking system 15006, the operating room mapping module 15004, and/or the surgical hub 15002. In one aspect, a notification may be displayed to one or more users based on the operating room mapping module 15004 determining that the user has control over the surgical instrument. For example, the AR device being worn by the surgeon may display a notification indicating that the surgeon has controlled the surgical instrument presented to the surgeon by another OR staff member. As another example, both the first AR device being worn by the surgeon and the second AR device being worn by the OR staff member may display a notification indicating that the surgeon has taken control of the surgical instrument from the OR staff member. In some aspects, whether the AR device of a particular user displays a notification may be based on a priority level associated with data tracked by the tracking system 15006 and/or information determined by the surgical hub 15002.
In one aspect, the surgical device may be intended for simultaneous use by multiple users. In this regard, portions (e.g., segments) of the surgical device may be individually tracked by the tracking system 15006. For example, a circular stapling device may include a device portion having a retractable trocar controllable by an adjustable knob. The circular stapling device may further comprise an attachable anvil portion. Different portions of the circular stapling apparatus may be controlled by different users and tracked separately by the tracking system 15006. Based on data from the tracking system 15006, a display device (e.g., an AR device) associated with each user may be configured to be able to display different overlays based on the portion of the device that the user has control over. For example, as the user adjusts the knob, a display associated with the user controlling the adjustable trocar may display an overlay based on the knob pressure. Displays associated with both users may display the status of the attachment of the anvil to the trocar.
In some aspects, the user may not wear a trackable sensor (e.g., user/device sensor 15022). The tracking system 15006 may be configured to be able to track actions of the user and/or a device controlled by the user using passive tracking (e.g., using the imaging device 15010). For example, a nurse may not wear a trackable sensor. The nurse may reload the endocutter with a staple cartridge. The particular cartridge type may be color coded. The reload exchange performed by the nurse may be detected based on a camera of the tracking system 15006. Further, a notification based on the reload exchange detected by the camera may be displayed to the user determined by the surgical hub 15002 to have last used the device (e.g., a surgeon wearing an active tracking device). The notification may include an overlay indicating the reloaded cartridge type based on the color of the cartridge detected by the camera of the tracking system 15006. Such tracking may enable detection of potential errors, such as loading an incorrect type of staple cartridge into the endocutter. Such tracking may also enable alerts (e.g., displayed notifications) to be issued based on these detections. Further, such tracking may provide the user with awareness of actions that the user cannot directly observe.
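The color-coded reload check described above can be sketched as a lookup from detected color to cartridge type followed by a compatibility test; the color-to-cartridge mapping and compatibility table are hypothetical and are not actual product specifications.

```python
# Illustrative reload check: map a detected cartridge color to a type and
# verify compatibility with the instrument (mapping is hypothetical).
CARTRIDGE_BY_COLOR = {"white": "vascular_30mm", "blue": "medium_45mm", "green": "thick_60mm"}
COMPATIBLE = {"endocutter_45": {"vascular_30mm", "medium_45mm"}}

def check_reload(instrument_id: str, detected_color: str) -> str:
    cartridge = CARTRIDGE_BY_COLOR.get(detected_color)
    if cartridge is None:
        return "Unrecognized cartridge color"
    if cartridge not in COMPATIBLE.get(instrument_id, set()):
        return f"Alert: {cartridge} is not compatible with {instrument_id}"
    return f"Reload OK: {cartridge} loaded into {instrument_id}"

print(check_reload("endocutter_45", "green"))  # would surface as an AR alert overlay
```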
Fig. 15 is a table 15100 of exemplary tracked object interactions determined by the surgical hub 15002 based on data generated by the tracking system 15006 in accordance with at least one non-limiting aspect of the present disclosure. Each tracked object interaction 15102 is associated with a timestamp 15104 and the object's position 15106 during the interaction. Where applicable, the object interactions are associated with glove ID 15108 and device ID 15110 based on the user and/or device involved in the object interactions (e.g., as determined based on user/device sensor 15022, imaging device 15010, etc.). In the case of tracking multiple parts of a device, the object interaction also includes a device part identification 15112. The table also includes a type of action 15114 determined by the surgical hub 15002 based on the tracked information associated with the object interaction. The table also includes a tracking source 15116 (e.g., imaging device 15010 (camera); user/device sensor 15022 (wearable)). Thus, based on data from the tracking system 15006, the surgical hub 15002 can identify various actions that occur during a surgical procedure, such as device exchanges, device operation, and the like.
In various aspects, the information shown in the example table of fig. 15 may be stored in a non-relational database or in a format such as JSON array, allowing additional information to be associated with an entry (e.g., stored by storage 5 of fig. 1; storage 55 of fig. 5, etc.). For example, the detected object interactions may be entered into an event log that includes various device-related information such as time stamp, device location/rotation, equipment/device ID, and determined actions. In some aspects, this information may be associated with device sensing data (e.g., tracked force/stress, finger position, and other relevant diagnostic data).
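As an example of the non-relational storage format mentioned above, a single tracked object interaction might be serialized as a JSON object along the following lines; the field names mirror the columns of table 15100 and the event-log description, but the exact schema is an assumption.

```python
# Illustrative JSON serialization of one tracked object interaction
# (schema is hypothetical; fields follow the columns of table 15100).
import json

event = {
    "timestamp": "2024-01-01T10:42:17Z",
    "location": {"x": 1.8, "y": 2.4, "z": 1.1},
    "glove_id": "GLV-07",
    "device_id": "DEV-1123",
    "device_portion": "anvil",
    "action": "device_exchange",
    "tracking_source": ["camera", "wearable"],
    "device_sensing": {"grip_force_n": 12.4, "finger_positions": [0.2, 0.4, 0.7]},
}

event_log = []              # appended to an event log stored as a JSON array
event_log.append(event)
print(json.dumps(event_log, indent=2))
```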
Off-screen interaction with a rendered image created based on predictions
In various aspects, disclosed herein are apparatuses, systems, and methods for tracking properties of surgical instruments outside of a field of view of an imaging device and displaying the tracked properties using superposition on an augmented reality device and other display devices. Referring again to fig. 12, the surgical system 15000 may include a tracking system 15006 configured to visualize and/or track various objects within an operating room. As described above, in some aspects, the tracking system 15006 may include a visualization system 15008 and one or more imaging devices 15010. The visualization system 15008 may be similar in many respects to the visualization systems 8, 58 described herein, and the imaging device 15010 may be similar in many respects to the imaging devices 24, 96, the AR device 66, and/or other imaging sensors described herein. Thus, the visualization system 15008 may be configured to be able to capture images of a surgical field (e.g., live feed) during a surgical procedure. For example, when the surgical instrument is used to perform a surgical procedure, the visualization system may capture an image of the surgical instrument in the surgical field. The images captured by the visualization system 15008 may be displayed by any of the display devices disclosed herein, such as an Augmented Reality (AR) display device, to assist the surgical staff during the surgical procedure.
As also described above, in some aspects, the tracking system 15006 may utilize the fiducial markers 15020 to track various properties of the surgical device. The fiducial markers 15020 may be any type of markers configured to facilitate tracking of the position, location, and/or movement of the object relative to the field of view of the imaging device 15010 and/or relative to the position, location, and/or movement detected by other sensors/devices of the tracking system 15006. For example, the fiducial marker 15020 may include an RFID (radio frequency identification) chip configured to be able to track the location and/or position of an object to which the RFID chip is attached. Thus, in some aspects, fiducial markers 15020 may be placed in and/OR on surgical equipment, operating room equipment, objects worn by OR staff, OR any other object that may be tracked by tracking system 15006.
In various aspects, the surgical hub 15002 may be configured to enable a display (e.g., the AR device 66) of the surgical system 15000 to display a captured image based on the object in the surgical field of view of the imaging device 15010 that is superimposed with a graphic representing the attribute of the object determined based on the fiducial markers 15020. In some aspects, fiducial markers 15020 may be included on/in the surgical instrument. Based on the fiducial markers, the tracking system 15006 may be configured to identify the type of surgical instrument and/or various other attributes of the surgical instrument associated with the fiducial markers 15020.
In one aspect, the tracking system 15006 may detect the position and orientation of the fiducial marker 15020. The fiducial marker may be located on a first portion of the surgical instrument. Based on the detected position and orientation of the fiducial marker 15020, the surgical hub 15002 may determine the position and orientation of the second portion of the surgical instrument relative to the image of the surgical field captured by the imaging device 15010. Accordingly, the surgical hub 15002 may cause the AR device 66 to display a graphic related to the position and orientation of the second portion of the surgical instrument based on the fiducial markers 15020, the graphic superimposed over the image of the surgical field of view. In another aspect, the second portion of the surgical instrument may be outside the field of view of the imaging device 15010. Therefore, the second portion of the surgical instrument cannot be viewed based solely on the image captured by the imaging device 15010. In this regard, a graphic associated with the second portion of the surgical instrument may be displayed by the AR device 66 as an overlay representing the position and orientation of the second portion of the surgical instrument. Thus, a user viewing the AR device 66 may perceive the position and orientation of the second portion of the surgical instrument, even when that portion of the surgical instrument is outside the field of view of the imaging device 15010.
For example, the endocutter may include fiducial markers 15020 on the endocutter's handle. Based on the fiducial markers 15020, the surgical hub 15002 may determine the position of the end effector of the endocutter. A surgeon operating the endocutter may view an image of the surgical field captured by the imaging device 15010 using the AR device 66. The end effector may not be in the field of view of the imaging device 15010. Thus, to assist the surgeon in perceiving the position and orientation of the end effector, the AR device 66 may display a graphic related to the position and orientation of the end effector. The graphic may be, for example, a rendered image of the end effector or a graphical object pointing to the position of the end effector.
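Deriving the position of an off-screen end effector from a tracked fiducial marker on the handle amounts to applying a known rigid offset to the marker pose. The planar sketch below uses a hypothetical 2D rotation and an assumed handle-to-effector offset for clarity; a real system would use full 3D poses.

```python
# Illustrative 2D computation of an off-screen end effector position from
# a tracked fiducial marker pose and a known marker-to-effector offset.
import math

def end_effector_position(marker_xy: tuple[float, float],
                          marker_heading_rad: float,
                          offset_along_shaft_m: float) -> tuple[float, float]:
    """Rotate the fixed offset by the marker heading and add it to the marker position."""
    dx = offset_along_shaft_m * math.cos(marker_heading_rad)
    dy = offset_along_shaft_m * math.sin(marker_heading_rad)
    return (marker_xy[0] + dx, marker_xy[1] + dy)

# Marker on the handle, end effector 0.35 m further along the shaft (assumed geometry).
pos = end_effector_position((0.10, 0.20), math.radians(30), 0.35)
print(tuple(round(p, 3) for p in pos))  # position used to render the overlay graphic
```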
In some aspects, the tracking system 15006 and/or the surgical hub 15002 may determine various other attributes of the object based on the fiducial markers 15020. In one aspect, the fiducial marker 15020 may be associated with a surgical instrument having a defined range of motion (e.g., a defined operating volume and/or area, a range of articulation, a range of rotation, etc.). Thus, based on the fiducial markers 15020, the surgical hub 15002 may determine the range of motion of the instrument relative to the image of the surgical field. For example, the endocutters mentioned in the above paragraphs may have a range of articulation and/or a range of rotational motion. Thus, based on the tracking of the fiducial markers 15020, the surgical hub 15002 may determine the range of motion of the endocutter and display a superimposed graphic representing the determined range of motion of the image relative to the surgical field.
In some aspects, the tracking system 15006 and/or the surgical hub 15002 may use the fiducial markers 15020 to verify an identity of an object, such as an identity and/or type of surgical instrument. For example, as described above, the surgical instrument may be communicatively coupled to a surgical hub (e.g., device/instrument 21 and surgical hub 56 of fig. 5). Based on the connection, the surgical hub can be configured to identify the instrument. The identification may include the type of instrument (e.g., 45mm stapler, 60mm stapler, etc.). Thus, the tracking system 15006 and fiducial markers 15020 may be used as an alternative and/or redundant means of identifying surgical instruments. In one aspect, identifying the instrument based on the fiducial markers 15020 may include determining whether the instrument is valid and/or available.
In some aspects, the tracking system 15006 and/OR the surgical hub 15002 may use the fiducial markers 15020 as markers within the OR to provide zero reference points. In another aspect, fiducial markers 15020 may be positioned at various locations around the OR to provide a frame of reference that may be used by the tracking system 15006 and/OR the surgical hub 15002 to orient other objects tracked by the tracking system 15006. For example, fiducial markers 15020 may be placed on the patient and/or the instrument to determine the relative position of the patient and instrument to each other. As another example, fiducial markers 15020 may be placed on the instruments to determine the proximity and/or relative distance of the instruments to each other. As yet another example, fiducial markers 15020 may be placed on instruments and other devices in the operating room, such as a table or cart, to determine if an instrument has been placed on the table.
In some aspects, the tracking system 15006 and/or the surgical hub 15002 may use the fiducial markers 15020 to detect potential incidents and/or safety issues related to movement of an object. The detected potential incidents and/or safety issues may be displayed as notifications via AR overlays. In one aspect, fiducial markers 15020 may be positioned at various locations to designate zero reference points and/or safe areas. The tracking system 15006 may be configured to detect when an object is near the boundary of a safe area or when an object is outside of a safe area. For example, the surgical procedure may involve the use of a robotic system in conjunction with laparoscopic instruments. Fiducial markers 15020 may be positioned to designate a safe area within which the robotic system may safely maneuver laparoscopic instruments. The tracking system 15006 and/or the surgical hub 15002 may be configured to identify that the laparoscopic instrument is outside a safe area and provide alerts to the user and/or adjust the operation of the robotic system.
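A safe-area check of the kind described above can be sketched as a containment test against a bounding region defined by fiducial markers; the box coordinates and margin below are illustrative assumptions.

```python
# Hypothetical safe-area containment check for a tracked instrument tip.
def inside_safe_area(tip: tuple[float, float, float],
                     area_min: tuple[float, float, float],
                     area_max: tuple[float, float, float],
                     margin_m: float = 0.0) -> bool:
    """True if the tip lies within the safe area shrunk by an optional margin."""
    return all(lo + margin_m <= p <= hi - margin_m
               for p, lo, hi in zip(tip, area_min, area_max))

safe_min, safe_max = (0.0, 0.0, 0.0), (0.4, 0.4, 0.3)   # bounds defined by fiducial markers
tip = (0.38, 0.10, 0.15)
if not inside_safe_area(tip, safe_min, safe_max, margin_m=0.05):
    print("Warning: instrument approaching safe-area boundary")  # AR alert and/or robot adjustment
```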
Fig. 16A and 16B illustrate exemplary intraoperative displays 15200 including images of a surgical instrument 15204 in a surgical field of view 15202 and graphics 15212, 15214 representing a portion 15210 of the surgical instrument 15204 outside the field of view in accordance with at least one non-limiting aspect of the present disclosure. The intraoperative display 15200 may be displayed by any of the display devices disclosed herein (e.g., AR device 66). Referring primarily to fig. 16A and 16B and fig. 12, an image of the surgical field 15202 may be captured by the imaging device 15010. Based on the image of the surgical field 15202 captured by the imaging device 15010, a first portion 15208 (e.g., grasper portion) of the surgical instrument 15204 can be seen to interact with the tissue 15206. However, the second portion 15210 (e.g., shaft portion) of the surgical instrument 15204 is outside the field of view of the imaging device 15010 and, therefore, is not visible in the image of the surgical field 15202. The surgical instrument 15204 may include fiducial markers 15020 (not shown in fig. 16A and 16B). Based on the fiducial markers 15020, the tracking system 15006 and/or the surgical hub 15002 may determine the position of the second portion 15210 of the surgical instrument 15204 relative to the surgical field. Thus, the surgical hub 15002 can cause the intraoperative display 15200 to include graphics 15212, 15214 representing the position of the second portion 15210. In the non-limiting aspect of fig. 16A, the graphic 15212 is a rendered image 15212 of the second portion 15210 of the surgical instrument 15204. In the non-limiting aspect of fig. 16B, the graphic 15214 is a graphic object representing the position of the second portion 15210 of the surgical instrument 15204 relative to the image of the surgical field 15202.
Accordingly, the surgical system 15000 can track the properties of the surgical instrument and display the tracked properties using a superposition displayed by the AR display device (e.g., AR device 66) and other display devices disclosed herein. The OR staff may rely on augmented reality and other display devices that display images of the surgical field captured by the imaging device. The surgical system 15000 may enable a worker to perceive portions of the surgical instrument that may be outside of the field of view of the imaging device. In addition, the surgical system 15000 may allow an OR staff member to more accurately perceive important attributes of the instrument that may not be viewable based on a single imaging device, such as the range of motion of the instrument and/OR the position of the instrument relative to other tracked objects.
Prediction of interactions and interrelationships of objects not in view
In various aspects, disclosed herein are devices, systems, and methods for predicting interactions of objects outside of a field of view of an imaging device and displaying properties of the objects based on the predicted interactions. Referring again to fig. 12, the surgical system 15000 may include a tracking system 15006 configured to visualize and/or track various objects within an operating room. As described above, in some aspects, the tracking system 15006 may include a visualization system 15008 and one or more imaging devices 15010. The visualization system 15008 may be similar in many respects to the visualization systems 8, 58 described above, and the imaging device 15010 may be similar in many respects to the imaging devices 24, 96, the AR device 66, and/or other imaging sensors described above. Thus, the visualization system 15008 may be configured to be able to capture images of a surgical field (e.g., live feed) during a surgical procedure. For example, the visualization system may capture an image of the surgical instrument in the surgical field while the surgical instrument is performing the surgical procedure. The images captured by the visualization system 15008 may be displayed by any of the display devices disclosed herein, such as an Augmented Reality (AR) display device, to assist the surgical staff during the surgical procedure.
In some aspects, as described above, the surgical instrument may be communicatively connected to a surgical hub (e.g., device/instrument 21 may be connected to surgical hub 56 of fig. 5). Accordingly, the surgical hub 15002 may be configured to receive instrument data from the surgical instrument relating to various sensed parameters and operational settings of the instrument. Based on the instrument data received by the surgical hub 15002, the surgical hub 15002 may determine the operating parameters of the instrument. For example, based on instrument data received from the various surgical instruments disclosed herein, the surgical hub 15002 may determine operating parameters such as speed, force, firing speed, firing force, activation status, power level, activation time, energy pattern, and the like.
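As a rough illustration of how a hub might derive operating parameters from streamed instrument data, consider the sketch below; the InstrumentReport fields and the force proxy are assumptions for illustration, not values defined by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class InstrumentReport:
    """Hypothetical packet of sensed parameters streamed from a connected instrument."""
    firings: int
    firing_times_s: list        # timestamps of recent firings, in seconds
    motor_current_a: float
    power_level: int
    activated: bool

def derive_operating_parameters(report: InstrumentReport) -> dict:
    """Turn raw instrument data into hub-level operating parameters
    (firing rate, firing-force proxy, power level, activation state)."""
    if len(report.firing_times_s) >= 2:
        span = report.firing_times_s[-1] - report.firing_times_s[0]
        firing_rate = (len(report.firing_times_s) - 1) / span if span > 0 else 0.0
    else:
        firing_rate = 0.0
    return {
        "firing_rate_hz": round(firing_rate, 3),
        "force_proxy": report.motor_current_a * 10.0,  # assumed linear current-to-force proxy
        "power_level": report.power_level,
        "activation_state": "active" if report.activated else "idle",
    }

print(derive_operating_parameters(
    InstrumentReport(firings=3, firing_times_s=[0.0, 2.0, 4.0],
                     motor_current_a=1.2, power_level=3, activated=True)))
```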
In some aspects, the surgical hub 15002 may be configured to identify interactions and potential interactions of the surgical instrument with other objects based on data from the tracking system 15006 and/or based on instrument data received from the surgical instrument. However, potential interactions of the surgical instrument with other objects may not be perceptible based solely on the images captured by the imaging device 15010 of the visualization system 15008. Thus, a user relying on a display device (e.g., AR device 66) that displays only captured images from the imaging device 15010 may not be able to respond accurately to potential interactions. To assist the user, the surgical hub 15002 can be configured to cause the display device to display various notifications and other graphical indicators (e.g., overlays) related to interactions and/or potential interactions detected by the surgical hub.
In one aspect, based on data from the tracking system 15006, the surgical hub 15002 may detect collisions or potential collisions of tracked objects. For example, using any combination of the various tracking techniques disclosed herein (e.g., imaging device 15010, structured light sensor 15012, LIDAR sensor 15014, floor sensor 15016, acoustic sensor 15018, fiducial markers 15020, user/device sensor 15022, and GPS 15024), the surgical hub 15002 may detect potential collisions between a portion of the surgical instrument and critical structures within the surgical field of view. As another example, the surgical hub 15002 may be configured to detect potential collisions between a plurality of surgical instruments. As yet another example, the surgical hub 15002 may be configured to detect potential collisions between various other objects in the surgical field. The detected potential collision and/or the detected collision may not be within the field of view of the imaging device 15010 and, thus, may not be viewable by the OR staff. Based on the detected potential collision and/or the detected collision, the surgical hub 15002 may cause a display device (e.g., the AR device 66) to display a notification, such as an overlay of information related to the collision. In one aspect, the notification may include an alert and/or other instructions for avoiding the collision. In another aspect, the notification may include an overlay with a graphical representation of the objects involved in the collision. Thus, the OR staff may perceive and take action on potential collisions and collisions that are not within the field of view of the imaging device 15010.
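A minimal sketch of this distance-threshold style of collision screening is shown below; the thresholds, object names, and positions are illustrative assumptions, and a real system would use the fused output of the tracking modalities listed above rather than a single coordinate per object.

```python
from dataclasses import dataclass
from itertools import combinations

@dataclass
class TrackedObject:
    name: str
    position: tuple      # (x, y, z) in mm, from any of the tracking modalities
    is_critical: bool    # e.g., a critical structure in the surgical field

def distance(a: tuple, b: tuple) -> float:
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def detect_collisions(objects: list, warn_mm: float = 10.0, collide_mm: float = 2.0) -> list:
    """Flag pairs of tracked objects that are colliding or approaching a collision.
    Threshold values here are illustrative, not values from the disclosure."""
    notifications = []
    for a, b in combinations(objects, 2):
        d = distance(a.position, b.position)
        if d <= collide_mm:
            notifications.append({"pair": (a.name, b.name), "level": "collision", "mm": round(d, 2)})
        elif d <= warn_mm:
            notifications.append({"pair": (a.name, b.name), "level": "potential", "mm": round(d, 2)})
    return notifications

objects = [
    TrackedObject("endocutter_shaft", (10.0, 10.0, 5.0), False),
    TrackedObject("grasper_jaw", (12.0, 14.0, 5.0), False),
    TrackedObject("critical_structure", (11.0, 10.5, 5.0), True),
]
for n in detect_collisions(objects):
    print(n)   # each entry could drive an AR overlay / warning on the display device
```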
In another aspect, based on data from the tracking system 15006, the surgical hub 15002 may detect unintended interactions of tracked objects. For example, similar to detecting potential collisions, the surgical hub 15002 may detect unintended interactions between a portion of the surgical instrument and critical structures within the surgical field of view. As another example, the surgical hub 15002 may detect unintended interactions between multiple surgical instruments. As yet another example, the surgical hub 15002 may detect unintended interactions between various other objects in the surgical field. The detected unintended interactions may not be within the field of view of the imaging device 15010 and, thus, may not be viewable by the OR staff. Based on the detected unintended interactions, the surgical hub 15002 may cause a display device (e.g., the AR device 66) to display notifications, such as overlays of information related to the unintended interactions, alerts and/or other instructions for avoiding the interactions, and/or overlays of graphical representations of the objects involved in the interactions. Accordingly, the OR staff may perceive and take action on unintended interactions that are not within the field of view of the imaging device 15010. In some aspects, the surgical hub 15002 may prevent operation of the instrument based on the detected unintended interactions.
For example, a user may be using a monopolar energy device. The tracking system 15006 and/or the surgical hub 15002 may detect that the monopolar energy device is adjacent to a metallic object (e.g., another surgical instrument or other object in the surgical field of view). The surgical hub 15002 may determine that there is a potential unintended interaction between the monopolar device and the metallic object because activating the monopolar device adjacent to the metallic object may cause an arc. Based on the detected unintended interaction, the surgical hub 15002 may cause the AR device 66 to display an overlaid alert regarding the interaction. In another aspect, the surgical hub 15002 may prevent activation of the monopolar energy device. In yet another aspect, the surgical hub 15002 can cause an overlay to be displayed indicating that the user should redirect the energy toward the intended treatment region.
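The monopolar example amounts to a simple interlock, which might be sketched as follows; the clearance threshold and the position inputs are assumptions, not values from the disclosure.

```python
def monopolar_activation_allowed(device_pos, metallic_object_positions,
                                 min_clearance_mm=15.0):
    """Hypothetical interlock: block activation of a monopolar device when any
    tracked metallic object is within an assumed arc-risk clearance."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

    offenders = [p for p in metallic_object_positions
                 if dist(device_pos, p) < min_clearance_mm]
    if offenders:
        # The hub could also push an overlay telling the user to redirect the energy.
        return False, f"blocked: {len(offenders)} metallic object(s) within clearance"
    return True, "activation permitted"

allowed, reason = monopolar_activation_allowed((0.0, 0.0, 0.0), [(5.0, 5.0, 0.0)])
print(allowed, reason)   # False, blocked: 1 metallic object(s) within clearance
```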
In some aspects, the notifications, alerts, and/or overlays displayed based on detected potential collisions between objects, detected collisions, detected unintended interactions, and other detected interactions may include attributes of the objects involved in the interaction. The attributes of the objects may be based on instrument data received by the surgical hub 15002 and/or based on tracking data from the tracking system 15006. For example, the force, velocity, impact, and/or physical magnitude of the interaction between objects may be displayed as an overlay. In other aspects, the notification, alert, and/or overlay may include a graphic indicating the location of the interaction. Thus, a user viewing the graphic may adjust the field of view of the imaging device to view the interaction.
In another aspect, based on data from the tracking system 15006, the surgical hub 15002 may cause a graphical overlay to be displayed by a display device (e.g., the AR device 66) to assist the user in performing the surgical procedure. In one aspect, the overlay may include a graphical representation and/or indicator for an object outside of the field of view of the imaging device 15010 that captures the image of the surgical field. The graphical overlay may provide information related to the location and/or other attributes of the object outside the field of view. For example, a surgeon may perform a procedure using a circular stapler. The procedure may involve attaching the device deck of the stapler to a separate anvil portion. The device deck of the stapler can be in the field of view, while the anvil can be out of view (e.g., out of view based on camera angle, out of view due to tissue obscuring the view of the anvil, etc.). The surgical hub 15002 can cause a display device to display a rendered image or other graphical representation of the anvil (e.g., indicating the off-image position of the anvil, superimposing the rendered image of the anvil on the occluding tissue, etc.). In another aspect, the surgical hub 15002 may cause the display device to display a directional indicator overlay that shows the direction and/or route in which tissue may be manipulated to optimize the attachment of the anvil to the device deck. Thus, the overlay may help the surgeon perceive how objects outside of the field of view of the imaging device 15010 may be manipulated to more easily achieve the desired result of the surgical procedure.
In another aspect, based on data from the tracking system 15006, the surgical hub 15002 may cause a graphical overlay to be displayed by a display device (e.g., the AR device 66), which may replace the need to use various instruments. For example, fig. 17A and 17B illustrate an exemplary intraoperative display 15300 showing a surgical field of view 15302 during cutting of stomach tissue 15306 in accordance with at least one non-limiting aspect of the present disclosure. The intraoperative display 15300 may be displayed by any of the display devices disclosed herein (e.g., AR device 66). Referring to fig. 17A, a bougie tube 15310 has been inserted into the patient to provide guidance as the surgeon forms a cut line 15308 in the stomach tissue 15306 using the endocutter 15312 with the assistance of the surgical grasper 15304. Referring now to fig. 17B, the bougie tube 15310 is no longer in use. Instead, in one aspect, a graphical overlay of a virtual bougie 15316 may be displayed by the intraoperative display 15300 to provide guidance to the surgeon for cutting the stomach tissue 15306. In another aspect, a graphical overlay of a cut line guide 15314 may be displayed by the intraoperative display 15300 to provide guidance to the surgeon. Thus, the graphical overlays 15314, 15316 can replace the need to physically insert the bougie tube 15310 into the patient.
Fig. 18 illustrates a method 15400 for mixed reality visualization of a surgical system in accordance with several non-limiting aspects of the present disclosure. The method 15400 may be implemented by any combination of a surgical system, surgical hub, tracking system, visualization system, augmentation system, AR device, any component thereof, and any other device and system disclosed herein (such as surgical systems 1, 2, 50, 52, 15000, surgical hub 6, 56, 5104, 15002, tracking system 15006, visualization system 8, 15008, communication system 63, augmentation system 83, and AR device 66, 84).
According to the method 15400, a first camera of a first visualization system may capture 15402 an image of an object in a surgical field of view, wherein a first portion of the object is outside of the field of view of the first camera. The tracking system may track 15404 the location of the second portion of the object. The surgical hub may determine 15406 a property of the object based on the tracked location of the second portion of the object, wherein the property of the object is related to the first portion of the object that is outside of the field of view of the camera. The augmented reality display device may display 15408 a captured image of an object in the surgical field and a graphic based on the attributes of the object. In one aspect, the object may include a surgical instrument, patient tissue, a user, or a combination thereof.
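A compact way to picture method 15400 is as a single capture-track-determine-display pass. The sketch below uses hypothetical component interfaces (capture_image, track, determine_attribute, display) and stand-in stubs so it can run on its own; it is not an implementation of the claimed method, only an illustration of the data flow among the visualization system, tracking system, surgical hub, and AR display.

```python
class _Stub:
    """Minimal stand-ins so the pipeline below can run; the real components are the
    visualization system, tracking system, surgical hub, and AR display device."""
    def capture_image(self): return "frame_0001"
    def track(self, portion): return (42.0, 7.0, 3.0)
    def determine_attribute(self, pos): return {"kind": "position", "value": pos}
    def display(self, image, graphic): print("display:", image, graphic)

def graphic_for(attribute):
    """Choose an overlay style for the determined attribute."""
    if attribute.get("kind") == "position":
        return {"overlay": "rendered_portion", "at": attribute["value"]}
    if attribute.get("kind") == "interaction":
        return {"overlay": "warning", "detail": attribute["value"]}
    return {"overlay": "generic", "detail": attribute}

def mixed_reality_visualization_step(camera, tracking_system, hub, ar_display):
    """One pass through the method, with hypothetical component interfaces."""
    image = camera.capture_image()                     # 15402: capture; first portion off-screen
    tracked = tracking_system.track("second_portion")  # 15404: track the second portion
    attribute = hub.determine_attribute(tracked)       # 15406: attribute of the off-screen portion
    ar_display.display(image, graphic_for(attribute))  # 15408: image plus attribute-based graphic

s = _Stub()
mixed_reality_visualization_step(s, s, s, s)
```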
According to one aspect of the method 15400, determining 15406 the attribute of the object based on the tracked location of the second portion of the object may include determining the location of the first portion of the object. Further, the display 15408 graphics may include a rendered image of the first portion of the augmented reality display device display object.
In another aspect of the method 15400, the tracking system may include a visualization system. Further, the visualization system may include a second camera. Tracking 15404 the location of the second portion of the object may include the second camera capturing an image of the second portion of the object. The second portion of the object may be outside the field of view of the first camera. In another aspect of the method 15400, tracking 15404 the location of the second portion of the object may include tracking the second portion of the object using structured light sensors, light detection and ranging (LIDAR) sensors, Radio Frequency Identification (RFID), Global Positioning System (GPS) tracking, audio beacons, non-visual light tracking, or a combination thereof.
In another aspect of the method 15400, a tracking system may track the location of structures in the surgical field. Further, determining 15406 the attribute of the object based on the tracked location of the second portion of the object may include the surgical hub identifying interaction of the first portion of the object with the structure. In one aspect, an augmented reality display device may display a graphic based on a location of a structure. In another aspect, displaying a graphic based on the location of the structure may include displaying a warning based on the identified interaction of the first portion of the object with the structure. In yet another aspect, displaying the graphic based on the location of the structure may include displaying a force of interaction, a speed of interaction, an indication of an impact of the first portion of the object with the structure, a power-on state of the object, a time, or a combination thereof.
In another aspect of the method 15400, the object may include a surgical instrument including a fiducial marker. In this aspect, tracking 15404 the position of the second portion of the object may include the tracking system tracking fiducial markers. In another aspect, the surgical hub can determine a range of motion of the surgical instrument based on the tracked fiducial markers. Further, displaying 15408 the graphic may include the augmented reality display device displaying a rendered image representing a range of motion of the surgical instrument.
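One plausible way to turn a tracked fiducial pose into a range-of-motion graphic is to sweep the instrument's assumed articulation limits and hand the resulting boundary points to the display device for rendering; the limits, reach, and 2-D geometry below are illustrative assumptions only.

```python
import math

def estimate_range_of_motion(marker_pose, articulation_limit_deg=45.0,
                             jaw_reach_mm=60.0, samples=16):
    """Sweep assumed articulation limits about a tracked marker pose (x, y, heading in
    degrees) to get boundary points; an AR display could render these as a
    range-of-motion graphic. The limit and reach values are placeholders."""
    x, y, heading_deg = marker_pose
    reachable = []
    for i in range(samples + 1):
        angle = math.radians(heading_deg - articulation_limit_deg
                             + 2 * articulation_limit_deg * i / samples)
        reachable.append((x + jaw_reach_mm * math.cos(angle),
                          y + jaw_reach_mm * math.sin(angle)))
    return reachable

points = estimate_range_of_motion((0.0, 0.0, 90.0))
print(f"{len(points)} boundary points, first={points[0]}, last={points[-1]}")
```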
Use of device-specific information and management across networked display devices
As described throughout this disclosure, various devices and instruments may be used to perform surgical procedures. These devices can vary widely. For example, the devices may have different device types and different device versions, each with different features and intended uses. In some cases, the features and intended use of a device may be updated by the device manufacturer. In addition, device manufacturers may develop new techniques for existing devices or release software updates related to device operation. In other cases, a device may be recalled by the manufacturer. In still other cases, there may be counterfeit devices or counterfeit device components that should not be used. Thus, there is a large amount of device identification-related information that the OR staff needs to be aware of when using devices for surgery.
In addition, there is a large amount of device operation-related information that the OR staff must consider when using a device. For example, device performance may degrade over time based on repeated use. As another example, a device may be overused or misused during a surgical procedure. Further, a device may sense information that the user may not be aware of or may not know how to easily access. Accordingly, there is a need for apparatuses, systems, and methods for managing device-related information and for allowing a user to easily access information related to the relevant devices.
In various aspects, disclosed herein are apparatuses, systems, and methods for managing device-related information. As described above, the device and surgical instrument may be communicatively coupled to a surgical hub (e.g., the device/instrument 21 may be coupled to the surgical hub 56 of FIG. 5). Accordingly, the surgical hub may be configured to receive device-related information from various devices used with various surgical systems. Further, the surgical hub may be communicatively coupled to a hospital network and/or a device manufacturer's network. For example, referring to fig. 5, a computer-implemented interactive surgical system 50 may include one or more surgical systems 52 including at least one surgical hub 56 in communication with a cloud 54, which may include a remote server 63. In one aspect, the cloud 54 and/or remote server 63 may be associated with a hospital network. The hospital network may be in communication with a device manufacturer database. In another aspect, cloud 54 and/or remote server 63 may be associated with a device manufacturer database.
In some aspects, devices/instruments 21 connected to the surgical hub 56 may be authenticated based on communication with a hospital network and/or a device manufacturer database. The hospital network may be configured to be able to determine whether a connected device/instrument 21 is authorized. For example, a counterfeit device attempting to connect to the surgical hub 56 may be unauthorized. In one aspect, the hospital network may communicate with the manufacturer database to determine that the counterfeit device is unauthorized. As another example, a recalled device attempting to connect to the surgical hub 56 may be unauthorized. The surgical hub 56 may, for example, prevent unauthorized devices/instruments 21 from being used.
In one aspect, authorization of the device/instrument 21 may be verified during surgery. In another aspect, authorization of the device/instrument may be verified when the device/instrument 21 and/or components of the device (e.g., reload cartridges, replacement components) are in storage. In yet another aspect, the surgical hub 56 may be configured to allow the procedure to proceed even if the device/instrument 21 has not been authorized. For example, if authorization is lacking due to a hospital network outage, the procedure may be allowed to proceed without device/instrument authorization.
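A hub-side authorization check along these lines might be sketched as follows; the registries, serial numbers, and outage-handling flag are hypothetical placeholders for the hospital network and manufacturer database lookups described above.

```python
RECALLED_SERIALS = {"EC-000123"}                  # stand-in for a manufacturer recall list
REGISTERED_SERIALS = {"EC-000777", "CS-000456"}   # stand-in for a genuine-device registry

def authorize_device(serial: str, network_reachable: bool = True) -> dict:
    """Hypothetical hub-side authorization check against hospital/manufacturer data.
    Mirrors the behavior above: a network outage may still allow the procedure to proceed."""
    if not network_reachable:
        return {"authorized": True, "note": "network outage - proceeding without verification"}
    if serial in RECALLED_SERIALS:
        return {"authorized": False, "note": "device recalled by manufacturer"}
    if serial not in REGISTERED_SERIALS:
        return {"authorized": False, "note": "serial not registered - possible counterfeit"}
    return {"authorized": True, "note": "verified"}

print(authorize_device("EC-000777"))
print(authorize_device("EC-000123"))
print(authorize_device("XX-999999", network_reachable=False))
```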
In some aspects, the connected device/instrument 21 may store information related to techniques, intended uses, and/or software updates for using the device. This information may be transmitted to the surgical hub 56 and stored by the hospital network (e.g., server 63). In other aspects, information related to techniques, intended uses, and/or software updates for using the device may be accessed from the device manufacturer's database after the device is connected. Instructions and/or the intended use of the device/instrument may be presented to a user of the device via a display device (e.g., AR device 66).
In some aspects, the hospital network and/or device manufacturer database may store information related to recommended and/or expected device usage. The device usage information may be used to determine whether a particular device/instrument 21 has exceeded a recommended use. For example, the device usage information may include a maximum recommended usage during a particular time period (e.g., the device may not be intended to be used beyond a specified time period, the device may not be intended to be activated beyond a specified number of times within a particular time period, etc.). As another example, the device usage information may include a maximum recommended number of activations and/or a maximum usage time during a lifetime of the device. As another example, the device usage information may include the intended use of the particular device/instrument 21. Based on the device usage information, the surgical hub 56 may be configured to alert the user (e.g., via a display device such as the AR device 66) to detected over-use and/or misuse. In other aspects, the surgical hub 56 may be configured to prevent further use of the device based on device usage information stored by the hospital network and/or the device manufacturer database.
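The usage checks described above reduce to comparing counters against stored limits, as in the following sketch; the policy numbers are placeholders rather than recommendations from any manufacturer, and the warnings are the kind of content the hub could surface as overlays or use to lock out a device.

```python
from dataclasses import dataclass

@dataclass
class UsagePolicy:
    """Recommended-use limits as they might be stored on the hospital or
    manufacturer database; the numbers are placeholders."""
    max_activations_per_procedure: int = 25
    max_lifetime_activations: int = 300
    max_continuous_activation_s: float = 30.0

def check_usage(policy: UsagePolicy, procedure_activations: int,
                lifetime_activations: int, last_activation_s: float) -> list:
    """Return warnings for detected over-use and/or misuse."""
    warnings = []
    if procedure_activations > policy.max_activations_per_procedure:
        warnings.append("activation count for this procedure exceeds recommendation")
    if lifetime_activations > policy.max_lifetime_activations:
        warnings.append("device is beyond its recommended service life")
    if last_activation_s > policy.max_continuous_activation_s:
        warnings.append("activation time exceeds recommended continuous use")
    return warnings

print(check_usage(UsagePolicy(), procedure_activations=27,
                  lifetime_activations=120, last_activation_s=12.0))
```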
In various aspects, disclosed herein are apparatuses, systems, and methods for allowing a user to easily access information related to a related device. Still referring to fig. 5, and also referring to fig. 8, a user using the device/instrument 21 connected to the surgical hub 56 may request a display device (e.g., hub display 65, instrument display 50, AR device 66, etc.) to display device information related to the device/instrument 21. For example, the surgeon may provide verbal cues detected by a microphone associated with the surgical hub 56 to request display device information (e.g., the surgeon may say "display information to me"). The user's request to display device information may cause the surgical hub 56 to cause the display device to display information related to the device/instrument 21.
In various aspects, the displayed information related to the device/appliance 21 may include information related to historical use of the device and other information related to the current operation of the device. In some aspects, the information displayed may vary depending on the type of device. In one aspect, if the device/instrument 21 is an energy device, the information displayed may include, for example, the number of activations, total time of activation, residual temperature at the jaws, estimated wear conditions of the device, device specific calibration and/or characterization information that may affect optimal use, device parameters of interest, or a combination thereof. On the other hand, if the device/instrument 21 is an endocutter, the information displayed may include, for example, the number of firings, estimated anvil resection information, the high/low of tissue gap based on firings on the test skin under construction, device parameters of interest (e.g., maximum articulation angle, jaw temperature, etc.), or a combination thereof. Thus, a user of the device can easily determine whether the device is approaching or has exceeded its recommended lifetime (e.g., based on the displayed usage history). The user can also easily access important parameters related to the operation of the device to assist in making decisions during surgery.
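Assembling such a type-specific information overlay could look like the sketch below; the dictionary keys and the two device types mirror the examples above, but the exact fields and units are assumptions rather than a defined interface.

```python
def build_info_overlay(device: dict) -> list:
    """Assemble the lines a display device might show when the user requests
    device information; fields vary by device type."""
    if device["type"] == "energy":
        return [f"activations: {device['activations']}",
                f"total activation time: {device['total_activation_s']} s",
                f"residual jaw temperature: {device['jaw_temp_c']} C",
                f"estimated wear: {device['wear_pct']} %"]
    if device["type"] == "endocutter":
        return [f"firings: {device['firings']}",
                f"max articulation angle: {device['max_articulation_deg']} deg",
                f"tissue gap (last firing): {device['tissue_gap_mm']} mm"]
    return ["no type-specific information available"]

for line in build_info_overlay({"type": "energy", "activations": 14,
                                "total_activation_s": 96, "jaw_temp_c": 41,
                                "wear_pct": 22}):
    print(line)
```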
Various additional aspects of the subject matter described herein are set forth in the following numbered embodiments:
example 1: a method for mixed reality visualization of a surgical system, the method comprising: capturing, by a first camera of a first visualization system, an image of an object in a surgical field of view, wherein a first portion of the object is located outside of the field of view of the first camera; tracking, by a tracking system, a position of a second portion of the object; determining, by a surgical hub, a property of the object based on the tracked position of the second portion of the object, wherein the property of the object is related to the first portion of the object that is outside of the field of view of the camera; and displaying, by an augmented reality display device, a captured image of the object in the surgical field and a graphic based on the attribute of the object; wherein the object comprises a surgical instrument, patient tissue, or a user, or a combination thereof.
Example 2: the method of embodiment 1, wherein determining the attribute of the object based on the tracked position of the second portion of the object comprises: determining a location of the first portion of the object; and wherein displaying the graphic comprises displaying, by the augmented reality display device, a rendered image of the first portion of the object.
Example 3: the method of any of embodiments 1-2, wherein the tracking system comprises the visualization system; wherein the visualization system comprises a second camera; wherein tracking the location of the second portion of the object comprises capturing, by the second camera, an image of the second portion of the object; and wherein the second portion of the object is located outside the field of view of the first camera.
Example 4: the method of any of embodiments 1-3, wherein tracking, by the tracking system, the location of the second portion of the object comprises tracking the second portion of the object using a structured light sensor, a light detection and ranging (LIDAR) sensor, radio Frequency Identification (RFID), global Positioning System (GPS) tracking, an audio beacon, non-visual light tracking, or a combination thereof.
Example 5: the method of embodiments 1-4, further comprising tracking, by the tracking system, a location of a structure in the surgical field of view; wherein determining the attribute of the object based on the tracked position of the second portion of the object includes identifying, by the surgical hub, interaction of the first portion of the object with the structure.
Example 6: the method of any one of embodiments 1-5, further comprising displaying, by the augmented reality display device, a graphic based on the location of the structure.
Example 7: the method of any of embodiments 1-6, wherein displaying the graphic based on the attribute of the object includes displaying, by the augmented reality display device, a warning based on the identified interaction.
Example 8: the method of any of embodiments 1-7, wherein displaying the graphic based on the attribute of the object comprises displaying, by the augmented reality display device, a force of the interaction, a speed of the interaction, an indication of an impact of the first portion of the object with the structure, a power-on state of the object, a time, or a combination thereof.
Example 9: the method of any of embodiments 1-8, wherein the object comprises a surgical instrument comprising a fiducial marker; and wherein tracking the position of the second portion of the object comprises tracking the fiducial marker by the tracking system.
Example 10: the method of any of embodiments 1-9, further comprising determining, by the surgical hub, a range of motion of a surgical instrument based on the tracked fiducial markers; wherein displaying the graphic includes displaying, by the augmented reality display device, a graphic representing the range of motion of the surgical instrument.
Example 11: a surgical system for mixed reality visualization, the system comprising: a first visualization system comprising a first camera configured to capture an image of an object in a surgical field of view, wherein a first portion of the object is located outside of the field of view of the camera; a first tracking system configured to be able to track a position of a second portion of the object; a surgical hub configured to determine a property of the object based on the tracked position of the second portion of the object, wherein the property of the object is related to the first portion of the object that is outside of the field of view of the camera; and an augmented reality display device configured to display a captured image of the object in the surgical field and a graphic based on the attribute of the object; wherein the object comprises a surgical instrument, patient tissue, a user, or a combination thereof.
Example 12: the system of embodiment 11, wherein the attribute of the object includes a location of the first portion of the object; and wherein the graphic comprises a rendered image of the first portion of the object.
Example 13: the method of any one of embodiments 11 to 12, wherein the tracking system comprises the visualization system; wherein the visualization system comprises a second camera configured to be capable of capturing an image of the second portion of the object; and wherein the second portion of the object is located outside the field of view of the first camera of the first visualization system.
Example 14: the system of any of embodiments 11-13, wherein the tracking system uses a structured light sensor, a light detection and ranging (LIDAR) sensor, a Radio Frequency Identification (RFID), a Global Positioning System (GPS) tracking, an audio beacon, a non-visual light tracking, or a combination thereof.
Example 15: the system of any of embodiments 11-14, wherein the tracking system is configured to track a location of a structure in the surgical field of view; wherein the surgical hub is configured to identify interaction of the first portion of the object with the structure; and wherein the attribute of the object includes the identified interaction.
Example 16: the system of any of embodiments 11-15, wherein the augmented reality display device is configured to be capable of displaying a graphic based on the location of the structure.
Example 17: the system of any of embodiments 11-16, wherein the graphic based on the attribute of the object includes a warning based on the identified interaction.
Example 18: the system of any of embodiments 11-17, wherein the graphic based on the attribute of the object includes a force of the interaction, a speed of the interaction, an indication of an impact of the first portion of the object with the structure, a power-on state of the object, a time, or a combination thereof.
Example 19: the system of any of embodiments 11-18, wherein the object comprises the surgical instrument; wherein the surgical instrument comprises a fiducial marker; and wherein the tracking system is configured to be able to track the fiducial markers.
Example 20: the system of any of embodiments 11-19, wherein the surgical hub is configured to determine a range of motion of a surgical instrument based on the tracked fiducial markers; wherein the attribute of the object comprises the range of motion of the surgical instrument; and wherein the graphic based on the attribute of the object comprises a graphic representing the range of motion of the surgical instrument.
While various forms have been illustrated and described, it is not the intention of the applicant to restrict or limit the scope of the appended claims to such detail. Many modifications, variations, changes, substitutions, combinations, and equivalents of these forms may be made by one skilled in the art without departing from the scope of the disclosure. Furthermore, the structure of each element associated with the described form may alternatively be described as a means for providing the function performed by the element. In addition, where materials for certain components are disclosed, other materials may be used. It is, therefore, to be understood that the foregoing detailed description and the appended claims are intended to cover all such modifications, combinations, and variations as fall within the scope of the disclosed forms of the invention. The appended claims are intended to cover all such modifications, variations, changes, substitutions, modifications and equivalents.
The foregoing detailed description has set forth various forms of the apparatus and/or methods via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Those skilled in the art will recognize that some aspects of the forms disclosed herein may be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or hardware would be well within the skill of one of skill in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product or products in a variety of forms, and that an illustrative form of the subject matter described herein applies regardless of the particular type of signal bearing media used to actually carry out the distribution.
Instructions for programming logic to perform the various disclosed aspects can be stored within a memory in a system, such as Dynamic Random Access Memory (DRAM), cache, flash memory, or other memory. Furthermore, the instructions may be distributed via a network or by other computer readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), but is not limited to floppy diskettes, optical disks, compact disk read-only memories (CD-ROMs), and magneto-optical disks, read-only memories (ROMs), random Access Memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or a tangible, machine-readable storage device for use in transmitting information over the internet via electrical, optical, acoustic, or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a non-transitory computer-readable medium includes any type of tangible machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
As used in any aspect herein, the term "control circuitry" may refer to, for example, hardwired circuitry, programmable circuitry (e.g., a computer processor including one or more separate instruction processing cores, processing units, processors, microcontrollers, microcontroller units, controllers, digital Signal Processors (DSPs), programmable Logic Devices (PLDs), programmable Logic Arrays (PLAs), field Programmable Gate Arrays (FPGAs)), state machine circuitry, firmware storing instructions executed by the programmable circuitry, and any combination thereof. The control circuitry may be implemented collectively or individually as circuitry forming part of a larger system, such as an Integrated Circuit (IC), an Application Specific Integrated Circuit (ASIC), a system-on-a-chip (SoC), a desktop computer, a laptop computer, a tablet computer, a server, a smart phone, or the like. Thus, as used herein, "control circuitry" includes, but is not limited to, electronic circuitry having at least one discrete circuit, electronic circuitry having at least one integrated circuit, electronic circuitry having at least one application specific integrated circuit, electronic circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program that at least partially implements the methods and/or apparatus described herein, or a microprocessor configured by a computer program that at least partially implements the methods and/or apparatus described herein), electronic circuitry forming a memory device (e.g., forming a random access memory), and/or electronic circuitry forming a communication device (e.g., a modem, communication switch, or optoelectronic device). Those skilled in the art will recognize that the subject matter described herein may be implemented in analog or digital fashion, or some combination thereof.
As used in any aspect herein, the term "logic" may refer to an application, software, firmware, and/or circuitry configured to be capable of performing any of the foregoing operations. The software may be embodied as software packages, code, instructions, instruction sets, and/or data recorded on a non-transitory computer readable storage medium. The firmware may be embodied as code, instructions or a set of instructions and/or data that are hard-coded (e.g., non-volatile) in a memory device.
As used in any aspect herein, the terms "component," "system," "module," and the like can refer to a control circuit, a computer-related entity, hardware, a combination of hardware and software, or software in execution.
As used in any aspect herein, an "algorithm" refers to an organized sequence of steps leading to a desired result, wherein "step" refers to the manipulation of physical quantities and/or logical states, which may, but need not, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. Are often used to refer to signals such as bits, values, elements, symbols, characters, terms, numbers, or the like. These and similar terms may be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities and/or conditions.
The network may comprise a packet switched network. The communication devices may be capable of communicating with each other using a selected packet switched network communication protocol. One exemplary communication protocol may include an Ethernet communication protocol that may be capable of allowing communication using transmission control protocol/internet protocol (TCP/IP). The Ethernet protocol may conform to or be compatible with the Ethernet standard titled "IEEE 802.3 Standard" published by the Institute of Electrical and Electronics Engineers (IEEE) in December 2008 and/or later versions of this standard. Alternatively or additionally, the communication devices may be capable of communicating with each other using an X.25 communication protocol. The X.25 communication protocol may conform to or be compatible with standards promulgated by the International Telecommunication Union Telecommunication Standardization Sector (ITU-T). Alternatively or additionally, the communication devices may be capable of communicating with each other using a frame relay communication protocol. The frame relay communication protocol may conform to or be compatible with standards promulgated by the International Telegraph and Telephone Consultative Committee (CCITT) and/or the American National Standards Institute (ANSI). Alternatively or additionally, the transceivers may be capable of communicating with each other using an Asynchronous Transfer Mode (ATM) communication protocol. The ATM communication protocol may conform to or be compatible with the ATM standard titled "ATM-MPLS Network Interworking 2.0" published by the ATM Forum in August 2001 and/or later versions of this standard. Of course, different and/or later developed connection-oriented network communication protocols are likewise contemplated herein.
Unless specifically stated otherwise as apparent from the above disclosure, it is appreciated that throughout the above disclosure, discussions utilizing terms such as "processing," "computing," "calculating," "determining," "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
One or more components may be referred to herein as "configured to be capable of", "configurable to be capable of", "operable/operable", "adapted/adaptable", "capable of", "conformable/conformable", and the like. Those skilled in the art will recognize that "configured to be capable of" may generally encompass active and/or inactive and/or standby components unless the context indicates otherwise.
The terms "proximal" and "distal" are used herein with respect to a clinician manipulating a handle portion of a surgical instrument. The term "proximal" refers to the portion closest to the clinician, and the term "distal" refers to the portion located away from the clinician. It will also be appreciated that for simplicity and clarity, spatial terms such as "vertical," "horizontal," "upper," and "lower" may be used herein with respect to the drawings. However, surgical instruments are used in many orientations and positions, and these terms are not intended to be limiting and/or absolute.
Those skilled in the art will recognize that, in general, terms used herein, and particularly in the appended claims (e.g., bodies of the appended claims) are generally intended to be "open" terms (e.g., the term "including" should be construed as "including but not limited to," the term "having" should be construed as "having at least," the term "comprising" should be construed as "including but not limited to," etc.). It will be further understood by those with skill in the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases "at least one" and "one or more" to introduce claim(s). However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (e.g., "a" and/or "an" should typically be interpreted to mean "at least one" or "one or more"); the same holds true for the use of definite articles used to introduce claim recitations.
In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations," without other modifiers, typically means at least two recitations, or two or more recitations). Moreover, in those instances where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems having only a, only B, only C, A and B together, a and C together, B and C together, and/or A, B and C together, etc.). In those instances where a convention analogous to "A, B or at least one of C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" shall include but not be limited to systems having only a, only B, only C, A and B together, a and C together, B and C together, and/or A, B and C together, etc.). It will be further understood by those within the art that, in general, unless the context indicates otherwise, disjunctive words and/or phrases presenting two or more alternative terms in the detailed description, claims, or drawings should be understood to encompass the possibility of including one of the terms, either of the terms, or both. For example, the phrase "a or B" will generally be understood to include the possibility of "a" or "B" or "a and B".
For the purposes of the appended claims, those skilled in the art will understand that the operations recited therein can generally be performed in any order. Additionally, while various operational flow diagrams are set forth in one or more sequences, it should be understood that various operations may be performed in other sequences than the illustrated sequences, or may be performed concurrently. Examples of such alternative ordering may include overlapping, staggered, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other altered ordering unless the context dictates otherwise. Moreover, unless the context dictates otherwise, terms such as "responsive to," "related to," or other past-type adjectives are generally not intended to exclude such variants.
It is worth mentioning that any reference to "an aspect" or "an example" means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, the appearances of the phrases "in one aspect," "in an aspect," "in one example," and "in an example" in various places throughout this specification are not necessarily all referring to the same aspect. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more aspects.
Any patent application, patent, non-patent publication, or other disclosure material referred to in this specification and/or listed in any application data sheet is incorporated herein by reference, as if the incorporated material was not inconsistent herewith. Accordingly, and to the extent necessary, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is said to be incorporated by reference herein, but which conflicts with existing definitions, statements, or other disclosure material set forth herein will only be incorporated to the extent that no conflict arises between that incorporated material and the existing disclosure material.
In summary, many of the benefits resulting from employing the concepts described herein have been described. The foregoing detailed description of one or more forms has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications or variations of the present invention are possible in light of the above teachings. One or more of the forms selected and described are chosen to illustrate principles and practical application to thereby enable one of ordinary skill in the art to utilize various forms and various modifications as are suited to the particular use contemplated. The claims filed herewith are intended to define the full scope.

Claims (20)

1. A method for mixed reality visualization of a surgical system, the method comprising:
capturing, by a first camera of a first visualization system, an image of an object in a surgical field of view, wherein a first portion of the object is located outside of the field of view of the first camera;
tracking, by a tracking system, a position of a second portion of the object;
determining, by a surgical hub, a property of the object based on the tracked position of the second portion of the object, wherein the property of the object is related to the first portion of the object that is outside of the field of view of the camera; and
displaying, by an augmented reality display device, a captured image of the object in the surgical field and a graphic based on the attribute of the object;
wherein the object comprises a surgical instrument, patient tissue, or a user, or a combination thereof.
2. The method of claim 1, wherein determining the attribute of the object based on the tracked position of the second portion of the object comprises determining a position of the first portion of the object; and
wherein displaying the graphic includes displaying, by the augmented reality display device, a rendered image of the first portion of the object.
3. The method of claim 1, wherein the tracking system comprises the visualization system;
wherein the visualization system comprises a second camera;
wherein tracking the location of the second portion of the object comprises capturing, by the second camera, an image of the second portion of the object; and
wherein the second portion of the object is located outside the field of view of the first camera.
4. The method of claim 1, wherein tracking, by the tracking system, the location of the second portion of the object comprises tracking the second portion of the object using a structured light sensor, a light detection and ranging (LIDAR) sensor, a Radio Frequency Identification (RFID), a Global Positioning System (GPS) tracking, an audio beacon, a non-visual light tracking, or a combination thereof.
5. The method of claim 1, further comprising tracking, by the tracking system, a location of a structure in the surgical field of view;
wherein determining the attribute of the object based on the tracked position of the second portion of the object includes identifying, by the surgical hub, interaction of the first portion of the object with the structure.
6. The method of claim 5, further comprising displaying, by the augmented reality display device, a graphic based on the location of the structure.
7. The method of claim 5, wherein displaying the graphic based on the attribute of the object comprises displaying, by the augmented reality display device, a warning based on the identified interaction.
8. The method of claim 5, wherein displaying the graphic based on the attribute of the object comprises displaying, by the augmented reality display device, a force of the interaction, a speed of the interaction, an indication of an impact of the first portion of the object with the structure, a power-on state of the object, a time, or a combination thereof.
9. The method of claim 1, wherein the object comprises a surgical instrument comprising a fiducial marker; and
wherein tracking the position of the second portion of the object includes tracking the fiducial marker by the tracking system.
10. The method of claim 9, further comprising determining, by the surgical hub, a range of motion of a surgical instrument based on the tracked fiducial markers; and
wherein displaying the graphic includes displaying, by the augmented reality display device, a graphic representing the range of motion of the surgical instrument.
11. A surgical system for mixed reality visualization, the system comprising:
a first visualization system comprising a first camera configured to capture an image of an object in a surgical field of view, wherein a first portion of the object is located outside of the field of view of the camera;
a first tracking system configured to be able to track a position of a second portion of the object;
a surgical hub configured to determine a property of the object based on the tracked position of the second portion of the object, wherein the property of the object is related to the first portion of the object that is outside of the field of view of the camera; and
an augmented reality display device configured to display a captured image of the object in the surgical field and a graphic based on the attribute of the object;
wherein the object comprises a surgical instrument, patient tissue, a user, or a combination thereof.
12. The surgical system of claim 11, wherein the attribute of the object includes a location of the first portion of the object; and
wherein the graphic comprises a rendered image of the first portion of the object.
13. The surgical system of claim 11, wherein the tracking system comprises the visualization system;
wherein the visualization system comprises a second camera configured to be capable of capturing an image of the second portion of the object; and
wherein the second portion of the object is located outside the field of view of the first camera of the first visualization system.
14. The surgical system of claim 11, wherein the tracking system uses a structured light sensor, a light detection and ranging (LIDAR) sensor, a Radio Frequency Identification (RFID), a Global Positioning System (GPS) tracking, an audio beacon, a non-visual light tracking, or a combination thereof.
15. The surgical system of claim 11, wherein the tracking system is configured to track a location of a structure in the surgical field of view;
wherein the surgical hub is configured to identify interaction of the first portion of the object with the structure; and
wherein the attribute of the object includes the identified interaction.
16. The surgical system of claim 15, wherein the augmented reality display device is configured to display a graphic based on the location of the structure.
17. The surgical system of claim 15, wherein the graphic based on the attribute of the object comprises a warning based on the identified interaction.
18. The surgical system of claim 15, wherein the graphic based on the attribute of the object comprises a force of the interaction, a speed of the interaction, an indication of an impact of the first portion of the object with the structure, a power-on state of the object, a time, or a combination thereof.
19. The surgical system of claim 11, wherein the object comprises the surgical instrument;
wherein the surgical instrument comprises a fiducial marker; and
wherein the tracking system is configured to be able to track the fiducial marker.
20. The surgical system of claim 19, wherein the surgical hub is configured to determine a range of motion of a surgical instrument based on the tracked fiducial markers;
wherein the attribute of the object comprises the range of motion of the surgical instrument; and
wherein the graphic based on the attribute of the object comprises a graphic representing the range of motion of the surgical instrument.
CN202280040630.4A 2021-04-14 2022-04-11 Visualizing a mixture directly with rendering elements to display the mixture elements and actions occurring on and off screen Pending CN117441212A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US63/174,674 2021-04-14
US63/284,326 2021-11-30
US17/688,653 US20220331013A1 (en) 2021-04-14 2022-03-07 Mixing directly visualized with rendered elements to display blended elements and actions happening on-screen and off-screen
US17/688,653 2022-03-07
PCT/IB2022/053370 WO2022219498A1 (en) 2021-04-14 2022-04-11 Mixing directly visualized with rendered elements to display blended elements and actions happening on-screen and off-screen

Publications (1)

Publication Number Publication Date
CN117441212A true CN117441212A (en) 2024-01-23

Family

ID=89553987

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280040630.4A Pending CN117441212A (en) 2021-04-14 2022-04-11 Visualizing a mixture directly with rendering elements to display the mixture elements and actions occurring on and off screen

Country Status (1)

Country Link
CN (1) CN117441212A (en)

Similar Documents

Publication Publication Date Title
US20220331013A1 (en) Mixing directly visualized with rendered elements to display blended elements and actions happening on-screen and off-screen
EP4136648A1 (en) Intraoperative display for surgical systems
JP2024517603A (en) Selective and adjustable mixed reality overlays in the surgical field
WO2022219491A1 (en) System and method for tracking a portion of the user as a proxy for non-monitored instrument
WO2022219501A1 (en) System comprising a camera array deployable out of a channel of a tissue penetrating surgical device
JP2024514884A (en) Adaptability and tunability of overlay instrument information for surgical systems
CN117441212A (en) Visualizing a mixture directly with rendering elements to display the mixture elements and actions occurring on and off screen
JP2024514640A (en) Blending visualized directly on the rendered element showing blended elements and actions occurring on-screen and off-screen
CN117461093A (en) System and method for changing a display overlay of a surgical field based on a trigger event
EP4136656A1 (en) Systems and methods for changing display overlay of surgical field view based on triggering events
WO2022219504A1 (en) Cooperative overlays of interacting instruments which result in both overlays being effected
CN117480562A (en) Selective and adjustable mixed reality overlay in surgical field of view
CN118160044A (en) Customization of overlay data and configuration
CN117480563A (en) Intraoperative display for surgical system
CN117546252A (en) Adaptability and adjustability of surgical systems or superimposing instrument information
CN117479896A (en) System comprising a camera array deployable outside a channel of a tissue penetrating surgical device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination