
CN113282141A - Wearable portable computer and teaching platform based on mixed virtual reality - Google Patents

Wearable portable computer and teaching platform based on mixed virtual reality

Info

Publication number
CN113282141A
Authority
CN
China
Prior art keywords
sensor
assembly
computer
content
backpack
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110602374.0A
Other languages
Chinese (zh)
Inventor
吴慧欣
陈继坤
彭锋
杨梦凡
彭馨予
宋文辉
刘孟轩
宋宗珀
安丽鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan Zhongmeng Electronic Technology Co ltd
North China University of Water Resources and Electric Power
Original Assignee
Henan Zhongmeng Electronic Technology Co ltd
North China University of Water Resources and Electric Power
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan Zhongmeng Electronic Technology Co ltd, North China University of Water Resources and Electric Power filed Critical Henan Zhongmeng Electronic Technology Co ltd
Priority to CN202110602374.0A
Publication of CN113282141A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00: Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16: Constructional details or arrangements
    • G06F1/1613: Constructional details or arrangements for portable computers
    • G06F1/163: Wearable computers, e.g. on a belt
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/017: Head mounted
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/017: Head mounted
    • G02B27/0172: Head mounted characterised by optical features
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012: Head tracking input arrangements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00: Electrically-operated educational appliances
    • G09B5/06: Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G09B5/065: Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a wearable portable computer based on mixed virtual reality. The wearable portable computer comprises a backpack computer, a head-mounted display, a leg sensor, a foot sensor and a wireless sensing handle; the head-mounted display and the wireless sensing handle are both in communication connection with the backpack computer; the backpack computer is a computer carried in a backpack form, and a battery, a computer host and cooling equipment are arranged in the backpack; the battery supplies power to the computer host and the cooling equipment; the computer host is provided with an equipment interconnection interface; the head-mounted display is connected to the equipment interconnection interface and used for providing mixed reality vision for a user and surrounding observers; the leg sensor, the foot sensor and the wireless sensing handle are connected to the equipment interconnection interface. The invention provides better compatibility and expandability for peripheral equipment, and can be applied to teaching scenarios.

Description

Wearable portable computer and teaching platform based on mixed virtual reality
Technical Field
The invention relates to the technical field of wearable equipment, in particular to a wearable portable computer and a teaching platform based on mixed virtual reality.
Background
With the development of screen display technology, more and more display technologies are available; the semi-transparent LED display technology and portable projection technology that have appeared recently enrich the forms that display equipment can take and lay the foundation for related applications. However, common AR and gaming devices currently consist mainly of handheld game consoles, mobile phones, tablet computers, wearable devices and the like, and they still interact with users through conventional portable displays, which has certain limitations: the head-mounted display occupies the user's field of view, which interferes with observing the real world, and the wearable computer cannot rapidly switch between use scenarios; the portable head-mounted display is incompatible with connections to AR devices, mobile phones and game equipment; the portable head-mounted display has poor functional expandability; existing partially transparent panels cannot adjust their transparency and perform poorly in certain scenarios; and there are no relevant wearable devices or mixed reality applications for scenarios such as classroom teaching.
Disclosure of Invention
To address the poor compatibility and poor expandability of existing wearable equipment when interacting with the user, the invention provides a wearable portable computer and a teaching platform based on mixed virtual reality.
In one aspect, the invention provides a wearable portable computer based on mixed virtual reality, comprising a backpack computer, a head-mounted display, a leg sensor, a foot sensor and a wireless sensing handle; the head-mounted display and the wireless sensing handle are both in communication connection with the backpack computer;
the backpack computer is a computer carried in a backpack form, and a battery, a computer host and cooling equipment are arranged in the backpack; the battery supplies power to the computer host and the cooling equipment; the computer host is provided with an equipment interconnection interface;
the head-mounted display is connected to the equipment interconnection interface and used for providing mixed reality vision for a user and surrounding observers;
the leg sensor, the foot sensor and the wireless sensing handle are connected to the equipment interconnection interface, and the wireless sensing handle is used for interacting with a backpack computer; the leg sensor and the foot sensor are used for acquiring the motion postures and the motion states of the corresponding parts, so that the computer host can control the computer according to the motion postures and the motion states.
Furthermore, a power interface is arranged inside the backpack; the power interface is used for connecting to an external power supply, which in turn powers the battery.
Furthermore, the head-mounted display comprises a helmet, a visual interaction unit, a sound interaction unit, a sensor unit and a power supply unit which are all hung on the USB bus;
the visual interaction unit comprises an adjustable transparent screen assembly and a projection display assembly, and the adjustable transparent screen assembly and the projection display assembly are connected to an equipment interconnection interface through a cable or a back plate;
the sound interaction unit comprises an audio input module, an audio output module, a microphone array, an earphone interface and a wireless audio interface; the microphone array comprises a speech microphone and an ambient microphone; the audio input module is used for processing sound information acquired by the microphone array and transmitting the processed data to the backpack computer through the equipment interconnection interface; the audio output module is used for decoding the audio signal received from the video stream audio interface and then transmitting the decoded audio signal to the earphone interface and/or the wireless audio interface;
the sensor unit comprises a GPS module, an attitude sensor, a motion sensor, an environment sensor, a wearing sensor, a biosensor and a camera;
the power supply unit comprises a power supply interface used for accessing an external power supply.
Further, the adjustable transparent screen assembly includes a display assembly, a display assembly holder, and an optical assembly; the display assembly and the optical assembly are connected to a helmet by a display assembly holder;
the optical component is used for adjusting the light path from the display component to an observer;
the adjustable transparent screen assembly has a stowed state and a use state; in the retracted state, the adjustable transparent screen assembly is automatically closed; in the use state, an observer observes the screen content through the display component or observes the real world through the transparent screen.
Further, the projection display assembly comprises a projection assembly and a stabilizer assembly; the projection assembly is connected to a helmet through the stabilizer assembly; the projection assembly comprises a laser ranging sensor and an anti-shake lens; the stabilizer assembly includes a roll stabilizer and a pitch stabilizer.
In another aspect, the invention provides a teaching platform of a wearable portable computer based on mixed virtual reality, which comprises the wearable portable computer, a cloud platform and a local platform;
the cloud platform comprises a user management unit, an application/content mall unit and a cloud service unit; the user management unit is used for managing user accounts, equipment, subscriptions and purchases; the application/content mall unit is used for the content producer to publish the content and the content consumer to purchase, download and subscribe the related content; the cloud service unit is used for providing friend communication related functions, content related communities, online platforms between equipment and relevant SDK support of application/content;
the local platform comprises a multi-user management unit, an application and content management unit, a storage management unit and an equipment management unit; the multi-user management unit is used for accessing respective cloud services for each user, providing personalized settings and managing digital authorization held by each user; the application and content management unit is used for managing the content and resources in the local platform and realizing the addition, deletion and operation of related resources; the storage management unit is used for managing each storage medium of the local platform, reading the content in the storage medium and the authorization with the card, and managing the user data in the storage medium; the device management unit is used for setting hardware drivers and hardware related functions of the device, so that the device can operate according to the expectation of a user.
The invention has the beneficial effects that:
the adjustable transparent screen component and the projection display component in the head-mounted display provide visual interaction functions for users and observers, wherein the adjustable transparent screen component uses an electrochromic technology, so that the transmittance of a part of transparent screens can be adjusted, and the switching between AR-VR is realized; and a USB bus is used for providing a unified interconnection and expansion scheme for each component or peripheral equipment; the GPS module, the camera, the motion sensor, the environment sensor and other sensors are used, so that better AR and game use experience is provided for users; in addition, the invention can also be applied to teaching scenes such as classrooms and the like.
Drawings
Fig. 1 is a block diagram of a wearable portable computer based on mixed virtual reality according to an embodiment of the present invention;
FIG. 2 is a block diagram of a head mounted display according to an embodiment of the present invention;
fig. 3 is a block diagram of a sound interaction unit according to an embodiment of the present invention;
fig. 4 is a block diagram of a sensor unit according to an embodiment of the present invention;
fig. 5 is a schematic diagram of PD chain powering provided by an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an adjustable transparent screen assembly according to an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a projection display module according to an embodiment of the present invention;
fig. 8 is a functional structure schematic diagram of a cloud platform according to an embodiment of the present invention;
fig. 9 is a functional structure diagram of a local platform according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, an embodiment of the present invention provides a wearable portable computer based on mixed virtual reality, including a backpack computer, a head-mounted display, a leg sensor, a foot sensor, and a wireless sensing handle; the head-mounted display and the wireless sensing handle are both in communication connection with the backpack computer;
the backpack computer is a computer carried in a backpack form, and a battery, a computer host and cooling equipment are arranged in the backpack; the battery supplies power to the computer host and the cooling equipment; the computer host is provided with an equipment interconnection interface;
as an implementation manner, a power interface may be further disposed inside the backpack, and the power interface is used for accessing an external power source, and the battery is powered by the external power source. When the battery is fully charged or reaches a certain value, the charging is stopped, the external power supply is used for supplying power, and when the external power supply stops supplying power, the external power supply is converted into the battery for supplying power. The cooling device is used for reducing the temperature of the battery, the computer and related modules thereof and ensuring the normal operation of the device. The device interconnect interface is used to power and communicate with peripheral devices. The backpack computer may identify the device according to the USB protocol, utilizing the various functions provided by the peripheral device. The backpack computer comprises a wireless communication module for connecting the Internet, other equipment and human-computer interaction equipment. The user controls the computer related functions through the human-computer interaction device or the peripheral device.
The head-mounted display is connected to the equipment interconnection interface and used for providing mixed reality vision for a user and surrounding observers;
the leg sensor, the foot sensor and the wireless sensing handle are connected to the equipment interconnection interface, and the wireless sensing handle is used for interacting with a backpack computer; the leg sensor and the foot sensor are used for acquiring the motion postures and the motion states of the corresponding parts, so that the computer host can control the computer according to the motion postures and the motion states.
The wearable portable computer based on the mixed virtual reality can provide mixed reality vision for a user and surrounding observers through the head-mounted display.
On the basis of the above embodiment, as an implementable manner, the head-mounted display includes a helmet, and a visual interaction unit, a sound interaction unit, a sensor unit and a power supply unit all hung on a USB bus;
specifically, as shown in fig. 2, each component of the different units may adopt a universal serial bus, and may be hung on a USB bus.
The visual interaction unit comprises an adjustable transparent screen assembly and a projection display assembly, and the adjustable transparent screen assembly and the projection display assembly are connected to an equipment interconnection interface through a cable or a back plate;
In particular, the adjustable transparent screen assembly is used for the user's visual interaction, letting the user see the real environment as clearly as possible when nothing is being displayed; the projection display assembly is likewise used for visual interaction, projecting display content onto the ground for the user and surrounding observers to watch, provided it is safe to do so.
As shown in fig. 3, the sound interaction unit includes an audio input module, an audio output module, a microphone array, an earphone interface, and a wireless audio interface; the microphone array comprises a speech microphone and an ambient microphone; the audio input module is used for processing sound information acquired by the microphone array and transmitting the processed data to the backpack computer through the equipment interconnection interface; the audio output module is used for decoding the audio signal received from the video stream audio interface and then transmitting the decoded audio signal to the earphone interface and/or the wireless audio interface;
Specifically, the sound interaction unit handles voice and auditory interaction: it can let the user hear multimedia sound and environmental sound at the same time, or isolate outside sound to increase immersion, and it is provided with microphones to collect environmental sound as well as the user's voice and instructions. The audio input module processes the sound as required, applying steps such as active noise reduction, audio amplification, background-sound elimination, voice enhancement, channel mixing, in-ear monitoring (ear return) and noise measurement, and then encodes it into a format the equipment can recognise. The earphone interface is used for connecting earphones, and the wireless audio interface is used for connecting wireless audio equipment; the wireless audio interface can transmit sound to the playback device via Bluetooth, frequency modulation, amplitude modulation and similar methods.
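The audio-input chain can be pictured as a sequence of processing stages applied to each captured frame before encoding. The Python sketch below only illustrates that chaining pattern; the stage functions are placeholders and their names are assumptions, not the patent's algorithms.

```python
# Sketch of chaining the audio-input processing stages listed above.
# Stage bodies are placeholders; only the pipeline pattern is shown.

import numpy as np


def active_noise_reduction(frame: np.ndarray) -> np.ndarray:
    return frame  # placeholder for a real ANC step


def amplify(frame: np.ndarray, gain: float = 2.0) -> np.ndarray:
    return np.clip(frame * gain, -1.0, 1.0)


def remove_background(frame: np.ndarray) -> np.ndarray:
    return frame  # placeholder: e.g. subtract the ambient-microphone signal


def mix_to_mono(frame: np.ndarray) -> np.ndarray:
    return frame.mean(axis=0) if frame.ndim > 1 else frame


PIPELINE = [active_noise_reduction, amplify, remove_background, mix_to_mono]


def process_microphone_frame(frame: np.ndarray) -> np.ndarray:
    """Run one captured frame through the configured stages before it is
    encoded and sent to the backpack computer over the interconnect."""
    for stage in PIPELINE:
        frame = stage(frame)
    return frame
```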
The sensor unit comprises a GPS module, an attitude sensor, a motion sensor, an environment sensor, a wearing sensor, a biosensor and a camera;
In particular, the design of the sensor unit is shown in fig. 4. The GPS module acquires the user's geographical position via the global positioning system, so that location-based applications can obtain more accurate information. The motion and attitude sensors detect the user's movement and posture and provide the corresponding sensor data to applications. The environment sensor enhances the user's perception by detecting the state of the surrounding environment and supports related applications. The wearing sensor detects whether the device is being worn and controls the operating mode of the equipment accordingly. The biosensor detects the user's physiological condition and supports related applications. The number of cameras can be configured as required; for example, the system can include a main-view camera, an auxiliary camera and an environment camera, where the main-view camera is used for recording and AR applications, the auxiliary camera provides auxiliary imaging or works with other cameras to support specific applications, and the environment camera enhances the user's perception by detecting the surrounding environment and supporting related applications.
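As a rough illustration of how the listed sensors might be bundled into one reading for applications, the sketch below defines an assumed data structure and an example wearing-sensor policy; the field names and types are illustrative only and are not defined by the patent.

```python
# Assumed combined sensor frame matching the units listed above.

from dataclasses import dataclass, field
from typing import Dict, Tuple


@dataclass
class SensorFrame:
    gps: Tuple[float, float]                  # latitude, longitude
    attitude: Tuple[float, float, float]      # roll, pitch, yaw (rad)
    acceleration: Tuple[float, float, float]  # m/s^2, from the motion sensor
    environment: Dict[str, float] = field(default_factory=dict)  # e.g. lux, temperature
    worn: bool = True                         # wearing sensor: headset on?
    heart_rate: float = 0.0                   # biosensor reading
    cameras: Dict[str, bytes] = field(default_factory=dict)      # "main", "aux", "environment"


def should_sleep(frame: SensorFrame) -> bool:
    """Example policy: power down the displays when the wearing sensor
    reports that the headset has been taken off."""
    return not frame.worn
```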
The power supply unit comprises a power supply interface used for accessing an external power supply.
Specifically, as one implementation and as shown in fig. 5, power is supplied using a PD chain scheme. The external power supply powers the mobile power bank over PD, and the mobile power bank powers the core main device. The core main device in turn uses PD to power the peripheral devices. Each device contains a power management chip that converts the PD supply into the power specification required by each of its parts and powers them accordingly. The user can adjust the chain as required, changing the connections and the primary/secondary relationships of the parts in the power supply chain.
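The PD chain can be modelled as an ordered list of devices, each converting the negotiated PD input into its own internal rails. The sketch below uses assumed device names and voltages purely to illustrate the chain and the per-device power-management step.

```python
# Illustrative PD chain: external supply -> power bank -> core device ->
# peripherals, each stage converting PD input into its internal rails.
# Voltages and device names are assumptions.

from dataclasses import dataclass
from typing import List


@dataclass
class PdDevice:
    name: str
    pd_input_voltage: float      # volts negotiated over USB PD
    required_rails: List[float]  # internal rails produced by the PMIC

    def convert(self) -> List[str]:
        return [f"{self.name}: {self.pd_input_voltage:.0f} V -> {rail:.1f} V"
                for rail in self.required_rails]


# The user can rearrange this chain (connection order, primary/secondary roles).
POWER_CHAIN = [
    PdDevice("power bank",           pd_input_voltage=20.0, required_rails=[20.0]),
    PdDevice("core device",          pd_input_voltage=20.0, required_rails=[12.0, 5.0, 3.3]),
    PdDevice("head-mounted display", pd_input_voltage=9.0,  required_rails=[5.0, 1.8]),
]

for device in POWER_CHAIN:
    print("\n".join(device.convert()))
```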
On the basis of the above embodiments, structurally, as shown in fig. 6, the adjustable transparent screen assembly in the embodiment of the present invention includes a display assembly 11 (for example, an adjustable transparent screen using electrochromic technology as a display assembly), a display assembly holder 12, and an optical assembly 13; the display component 11 and the optical component 13 are connected to the helmet 14 through the display component holder 12 (for example, the display component 11 and the optical component 13 can be connected to a rotating shaft of the helmet 14 through a connector), and the optical component 13 is used for adjusting the optical path from the display component to the observer; the adjustable transparent screen assembly has a stowed state and a use state; in the retracted state, the adjustable transparent screen assembly is automatically closed; in the use state, an observer observes the screen content through the display component or observes the real world through the transparent screen.
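A compact way to picture the two states and the electrochromic transparency control is the state sketch below; the 0-1 transparency scale and the method names are assumptions made for illustration, not part of the patent.

```python
# Assumed state model for the adjustable transparent screen assembly.

from enum import Enum, auto


class ScreenState(Enum):
    STOWED = auto()   # assembly folded away; display switched off
    IN_USE = auto()   # in front of the eyes; shows content or passes light


class AdjustableTransparentScreen:
    def __init__(self) -> None:
        self.state = ScreenState.STOWED
        self.transparency = 1.0   # 1.0 = fully see-through (AR), 0.0 = opaque (VR-like)
        self.display_on = False

    def deploy(self) -> None:
        self.state = ScreenState.IN_USE
        self.display_on = True

    def stow(self) -> None:
        """Folding the assembly away automatically switches it off."""
        self.state = ScreenState.STOWED
        self.display_on = False

    def set_transparency(self, value: float) -> None:
        """Drive the electrochromic layer: high values for AR pass-through,
        low values to block the real world for an immersive view."""
        if self.state is ScreenState.IN_USE:
            self.transparency = min(max(value, 0.0), 1.0)
```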
On the basis of the above embodiments, structurally, as shown in fig. 7, the projection display assembly in the embodiment of the present invention includes a projection assembly 21 and a stabilizer assembly 22; the projection assembly 21 is connected to the helmet 14 through the stabilizer assembly 22; the projection assembly 21 comprises a laser ranging sensor 211 and an anti-shake lens 212; the stabilizer assembly 22 includes a roll stabilizer 221 and a pitch stabilizer 222.
Specifically, the projection assembly is connected to the helmet through the stabilizer assembly, and the manner of connection can be chosen as required. The stabilizer keeps the projected image steady as the equipment needs, and the stabilizer's configuration and mounting can likewise be chosen as required. The laser ranging sensor is used for automatically focusing the projected image, and the anti-shake lens can autofocus and keep the picture stable against shaking within a certain range.
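As an illustration of how the laser-measured distance and the helmet attitude might drive focusing and stabilisation, the sketch below uses a thin-lens focus estimate and a simple counter-rotation; the formula choice, focal length and gains are assumptions, not values taken from the patent.

```python
# Assumed autofocus and stabilizer-correction helpers for the projection
# assembly, driven by the laser ranging sensor and the helmet attitude.

from typing import Tuple


def focus_position_mm(distance_m: float, focal_length_mm: float = 20.0) -> float:
    """Thin-lens estimate of the lens-to-image distance for the throw
    distance reported by the laser ranging sensor."""
    d_mm = distance_m * 1000.0
    return (focal_length_mm * d_mm) / (d_mm - focal_length_mm)


def stabilizer_correction(helmet_roll: float, helmet_pitch: float,
                          gain: float = 1.0) -> Tuple[float, float]:
    """Counter-rotate the projection assembly by the helmet's measured roll
    and pitch so the projected image stays put while the head moves."""
    return (-gain * helmet_roll, -gain * helmet_pitch)
```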
The embodiment of the invention also provides a teaching platform of the wearable portable computer based on the mixed virtual reality, which comprises the wearable portable computer, a cloud platform and a local platform in the embodiments;
as shown in fig. 8, the cloud platform includes a user management unit, an application/content mall unit, and a cloud service unit; the user management unit is used for managing user accounts, equipment, subscriptions and purchases; the application/content mall unit is used for the content producer to publish the content and the content consumer to purchase, download and subscribe the related content; the cloud service unit is used for providing friend communication related functions, content related communities, online platforms between equipment and relevant SDK support of application/content;
as shown in fig. 9, the local platform includes a multi-user management unit, an application and content management unit, a storage management unit, and a device management unit; the multi-user management unit is used for accessing respective cloud services for each user, providing personalized settings and managing digital authorization held by each user; the application and content management unit is used for managing the content and resources in the local platform and realizing the addition, deletion and operation of related resources; the storage management unit is used for managing each storage medium of the local platform, reading the content in the storage medium and the authorization with the card, and managing the user data in the storage medium; the device management unit is used for setting hardware drivers and hardware related functions of the device, so that the device can operate according to the expectation of a user.
Specifically, mixed virtual reality is realised through the retractable adjustable transparent screen assembly: the user sees the mixed reality content through the panel, while surrounding observers can watch the user's main field of view through the portable projection. By carrying the wearable portable computer, the user can take mixed reality anywhere and use the computer as needed in playgrounds, classrooms and other places, regardless of location. The wearable portable computer can also transmit its picture to a remote screen by wireless screen casting or similar means, enabling remote display of the user's field of view, and it can obtain mixed reality applications through the application store or in the form of application cards.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (6)

1. A wearable portable computer based on mixed virtual reality, characterized in that it comprises a backpack computer, a head-mounted display, a leg sensor, a foot sensor and a wireless sensing handle; the head-mounted display and the wireless sensing handle are both in communication connection with the backpack computer; the backpack computer is a computer carried in the form of a backpack, and a battery, a computer host and cooling equipment are arranged inside the backpack; the battery supplies power to the computer host and the cooling equipment; the computer host is provided with an equipment interconnection interface; the head-mounted display is connected to the equipment interconnection interface and is used for providing mixed reality vision for the user and surrounding observers; the leg sensor, the foot sensor and the wireless sensing handle are connected to the equipment interconnection interface, and the wireless sensing handle is used for interacting with the backpack computer; the leg sensor and the foot sensor are used for collecting the motion posture and motion state of the corresponding body part, so that the computer host can control the computer according to the motion posture and motion state.

2. The wearable portable computer according to claim 1, characterized in that a power interface is further provided inside the backpack, the power interface is used for connecting to an external power supply, and the battery is powered by the external power supply.

3. The wearable portable computer according to claim 1, characterized in that the head-mounted display comprises a helmet, and a visual interaction unit, a sound interaction unit, a sensor unit and a power supply unit all attached to a USB bus; the visual interaction unit comprises an adjustable transparent screen assembly and a projection display assembly, which are connected to the equipment interconnection interface through a cable or a backplane; the sound interaction unit comprises an audio input module, an audio output module, a microphone array, an earphone interface and a wireless audio interface; the microphone array comprises a speech microphone and an ambient microphone; the audio input module is used for processing the sound information collected by the microphone array and transmitting the processed data to the backpack computer through the equipment interconnection interface; the audio output module is used for decoding the audio signal received from the video-stream audio interface and transmitting it to the earphone interface and/or the wireless audio interface; the sensor unit comprises a GPS module, an attitude sensor, a motion sensor, an environment sensor, a wearing sensor, a biosensor and a camera; the power supply unit comprises a power supply interface for accessing an external power supply.

4. The wearable portable computer according to claim 3, characterized in that the adjustable transparent screen assembly comprises a display assembly, a display assembly holder and an optical assembly; the display assembly and the optical assembly are connected to the helmet by the display assembly holder; the optical assembly is used for adjusting the optical path from the display assembly to the observer; the adjustable transparent screen assembly has a stowed state and a use state; in the stowed state, the adjustable transparent screen assembly is automatically switched off; in the use state, the observer views the screen content through the display assembly, or observes the real world through the transparent screen.

5. The wearable portable computer according to claim 3, characterized in that the projection display assembly comprises a projection assembly and a stabilizer assembly; the projection assembly is connected to the helmet through the stabilizer assembly; the projection assembly comprises a laser ranging sensor and an anti-shake lens; the stabilizer assembly comprises a roll stabilizer and a pitch stabilizer.

6. A teaching platform for the wearable portable computer based on mixed virtual reality, characterized in that it comprises the wearable portable computer according to any one of claims 1 to 5, a cloud platform and a local platform; the cloud platform comprises a user management unit, an application/content mall unit and a cloud service unit; the user management unit is used for managing user accounts, devices, subscriptions and purchases; the application/content mall unit is used for content producers to publish content, and for content consumers to purchase, download and subscribe to related content; the cloud service unit is used for providing friend-communication functions, content-related communities, an online platform between devices, and the relevant SDK support for applications/content; the local platform comprises a multi-user management unit, an application and content management unit, a storage management unit and a device management unit; the multi-user management unit is used for giving each user access to their respective cloud services, providing personalized settings, and managing the digital licenses held by each user; the application and content management unit is used for managing the content and resources on the local platform and for adding, deleting and running related resources; the storage management unit is used for managing each storage medium of the local platform, reading the content and on-card licenses in the storage media, and managing the user data in the storage media; the device management unit is used for setting the hardware drivers and hardware-related functions of the device so that the device operates as the user expects.
CN202110602374.0A 2021-05-31 2021-05-31 Wearable portable computer and teaching platform based on mix virtual reality Pending CN113282141A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110602374.0A CN113282141A (en) 2021-05-31 2021-05-31 Wearable portable computer and teaching platform based on mix virtual reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110602374.0A CN113282141A (en) 2021-05-31 2021-05-31 Wearable portable computer and teaching platform based on mix virtual reality

Publications (1)

Publication Number Publication Date
CN113282141A 2021-08-20

Family

ID=77282858

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110602374.0A Pending CN113282141A (en) 2021-05-31 2021-05-31 Wearable portable computer and teaching platform based on mix virtual reality

Country Status (1)

Country Link
CN (1) CN113282141A (en)


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102906623A (en) * 2010-02-28 2013-01-30 奥斯特豪特集团有限公司 Local advertising content on an interactive head-mounted eyepiece
CN104932679A (en) * 2014-03-21 2015-09-23 三星电子株式会社 Wearable device and method of operating the same
CN106662747A (en) * 2014-08-21 2017-05-10 微软技术许可有限责任公司 Head-mounted display with electrochromic dimming module for augmented and virtual reality perception
CN204695231U (en) * 2015-06-18 2015-10-07 陈会兵 Portable helmet immersion systems
US20180011682A1 (en) * 2016-07-06 2018-01-11 Bragi GmbH Variable computing engine for interactive media based upon user biometrics
CN107783639A (en) * 2016-08-24 2018-03-09 南京乐朋电子科技有限公司 Virtual reality leisure learning system
CN108288242A (en) * 2018-01-31 2018-07-17 上海维拓网络科技有限公司 Teaching controlling platform and control method based on virtual reality engine technology
CN111352239A (en) * 2018-12-22 2020-06-30 杭州融梦智能科技有限公司 Augmented reality display device and interaction method using augmented reality display device
CN109932054A (en) * 2019-04-24 2019-06-25 北京耘科科技有限公司 Wearable Acoustic detection identifying system
CN211786373U (en) * 2020-03-30 2020-10-27 哈雷医用(广州)智能技术有限公司 Portable AR wears display device
CN112051895A (en) * 2020-09-03 2020-12-08 西安戴森电子技术有限公司 Wearable solid state computer

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114360322A (en) * 2021-12-08 2022-04-15 江西中船航海仪器有限公司 Portable navigation sextant simulator
CN114360322B (en) * 2021-12-08 2023-02-17 江西中船航海仪器有限公司 Portable navigation sextant simulator

Similar Documents

Publication Publication Date Title
CN110830811B (en) Live broadcast interaction method, device, system, terminal and storage medium
CN109920065B (en) Information display method, device, equipment and storage medium
US9618747B2 (en) Head mounted display for viewing and creating a media file including omnidirectional image data and corresponding audio data
EP3862845A1 (en) Method for controlling display screen according to eyeball focus and head-mounted electronic equipment
US10176783B2 (en) Interactive wearable and portable smart devices
US11378802B2 (en) Smart eyeglasses
US9341866B2 (en) Spectacles having a built-in computer
RU2670784C9 (en) Orientation and visualization of virtual object
US20140333773A1 (en) Portable audio/ video mask
US20130044129A1 (en) Location based skins for mixed reality displays
US8400519B2 (en) Mobile terminal and method of controlling the operation of the mobile terminal
US12153224B2 (en) Display method, electronic device, and system
CN112965683A (en) Volume adjusting method and device, electronic equipment and medium
CN113395566B (en) Video playing method and device, electronic equipment and computer readable storage medium
CN112770177B (en) Multimedia file generation method, multimedia file release method and device
EP3376752A1 (en) Headset
US20100283711A1 (en) An integrated computation and communication system, a framed interface therefor and a method of operating thereof
CN106020459B (en) Intelligent glasses, and control method and control system of intelligent glasses
CN213876195U (en) Glasses frame and intelligent navigation glasses
CN108848405A (en) Image processing method and device
CN113282141A (en) Wearable portable computer and teaching platform based on mix virtual reality
KR20170046947A (en) Mobile terminal and method for controlling the same
US12189120B2 (en) Highly interactive head mount display environment for gaming
JP2020025275A (en) Video and audio reproduction device and method
CN218585615U (en) AR (augmented reality) glasses circuit and AR glasses

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210820