KR20160135067A - Apparatus and method for operating autonomous smart car using analysis bio signal - Google Patents
Apparatus and method for operating autonomous smart car using analysis bio signal
- Publication number
- KR20160135067A KR1020150068212A KR20150068212A
- Authority
- KR
- South Korea
- Prior art keywords
- smart car
- unit
- unmanned
- human body
- body signal
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 20
- 230000003340 mental effect Effects 0.000 claims abstract description 5
- 238000004891 communication Methods 0.000 claims description 20
- 230000000284 resting effect Effects 0.000 claims description 6
- 230000006996 mental state Effects 0.000 claims description 5
- 230000036651 mood Effects 0.000 abstract description 4
- 238000010586 diagram Methods 0.000 description 2
- 238000006467 substitution reaction Methods 0.000 description 2
- 230000004308 accommodation Effects 0.000 description 1
- 230000004075 alteration Effects 0.000 description 1
- 210000004556 brain Anatomy 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000011017 operating method Methods 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0217—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with energy consumption, time reduction or distance reduction criteria
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Traffic Control Systems (AREA)
- Multimedia (AREA)
- Biomedical Technology (AREA)
- Dermatology (AREA)
- Neurology (AREA)
- Neurosurgery (AREA)
Abstract
The present invention relates to a method and an apparatus for operating an unmanned smart car using human body signal analysis, which can analyze a human body signal, report the user's current state in real time, and automatically move the car to a place suited to the user's mood. The method comprises: sensing, by a human body signal sensing unit in the unmanned smart car, a human body signal of the user; analyzing, by a sensing signal analysis unit in the unmanned smart car, the sensed signal by comparing it with at least one of the user's sex, age, preferred resort, preferred food style, and the analysis data on physical and mental states according to human body signals stored in a memory unit; guiding, by a control unit of the unmanned smart car, the user to a resort according to the analyzed data; and controlling the unmanned smart car so that it drives itself to the resort through an unmanned operation unit.
Description
TECHNICAL FIELD The present invention relates to an unmanned smart car and, more particularly, to a method and apparatus for operating an unmanned smart car using human body signal analysis, which can analyze a human body signal, report the user's current state in real time, and automatically move the car to a place suited to the user's mood.
Recently, as technologies for image processing and sensors, along with hardware performance, have developed rapidly, research on unmanned automobiles has been pursued competitively.
In fact, GPS-based navigation has already become a popular service even for novice drivers, and as the accuracy of GPS data improves and its error range shrinks, its reliability keeps increasing.
Following this trend, unmanned automobiles have been driven or parked automatically using front and rear radar sensors, GPS, and the like.
Since unmanned vehicles are an inevitable trend, research on them has proceeded in various directions, but so far it has focused mainly on the unmanned driving of the vehicle itself.
SUMMARY OF THE INVENTION The present invention has been made to solve the above problems of the prior art, and an object of the present invention is to provide a method and apparatus for operating an unmanned smart car using human body signal analysis, which can analyze a human body signal, report the user's current state in real time, and automatically move the car to a place suited to the user's mood.
According to an aspect of the present invention, there is provided a method for operating an unmanned smart car using human body signal analysis, comprising: sensing, by a human body signal sensing unit in the unmanned smart car, a human body signal of a user in the car; analyzing, by a sensing signal analysis unit in the unmanned smart car, the sensed human body signal by comparing it with at least one of the user's sex, age, preferred resort, preferred food style, and the analysis data on physical and mental states according to human body signals stored in a memory unit; guiding, by a control unit of the unmanned smart car, the user to a resort according to the analyzed data; and controlling the unmanned smart car so that it is driven to the resort without a driver through an unmanned operation unit.
The analysis result of the sensed signal is guided through at least one of a voice conversation unit and an image output unit in the unmanned smart car, and when there are several users, guidance is provided for each of them.
When there are several users of the unmanned smart car, the control unit guides a plurality of resorts that suit what the users have in common, based on the data in the memory unit. The resorts found by the search are guided by their features and their pros and cons through the voice conversation unit, which can hold a voice conversation with the users, and through the image output unit, which can output video and audio about each resort.
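As an editorial illustration only, the following sketch shows one way such a "common part" of several users' preferences could be matched against candidate resorts; the `User` and `Resort` structures and the scoring rule are assumptions for illustration, not part of the claimed apparatus.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    preferred_resorts: set = field(default_factory=set)   # e.g. {"beach", "forest"}
    preferred_foods: set = field(default_factory=set)     # e.g. {"seafood", "noodles"}

@dataclass
class Resort:
    name: str
    tags: set = field(default_factory=set)                # resort type tags
    foods: set = field(default_factory=set)               # food styles offered nearby

def common_resorts(users, resorts, top_n=3):
    """Rank resorts by how well they match what all users have in common."""
    shared_types = set.intersection(*(u.preferred_resorts for u in users))
    shared_foods = set.intersection(*(u.preferred_foods for u in users))

    def score(r):
        # count how many shared preferences the resort satisfies
        return len(r.tags & shared_types) + len(r.foods & shared_foods)

    ranked = sorted(resorts, key=score, reverse=True)
    return [r for r in ranked if score(r) > 0][:top_n]
```

The control unit could then hand the returned candidates to the voice conversation unit and the image output unit to present each resort's features and its pros and cons.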
According to the analyzed data, the step of guiding the unmanned smart car user to a resort includes determining whether a rest is necessary according to the analysis result of the sensing signal analysis unit and, when a rest is necessary, searching for the conditions needed for resting through the memory unit and a recreation information providing server.
According to another aspect of the present invention, there is provided an unmanned smart car device using human body signal analysis, comprising: a human body signal sensing unit for sensing a human body signal of the unmanned smart car user; a communication unit for supporting wireless communication between the unmanned smart car and the outside; a memory unit for storing basic data of the unmanned smart car user, general human body signal data, human body signal data measured by the human body signal sensing unit, recreation information data, and traffic information data; an unmanned operation unit for automatically driving the unmanned smart car to a destination; a sensing signal analysis unit for analyzing the physical and psychological states of the unmanned smart car user by comparing the human body signal sensed by the human body signal sensing unit with the human body signal data stored in the memory unit; and a control unit for providing recreation information collected through at least one of the memory unit and the communication unit when the analyzed human body signal indicates that a rest is needed, and for controlling the unmanned operation unit so that the unmanned smart car automatically moves to a resort based on the recreation information.
Here, the memory unit stores at least one of the sex, age, preferred resort, and preferred food style of the user of the unmanned smart car, analysis data on the physical and mental states of a person according to human body signals, music data suited to those states, recreation information data, and traffic information data.
The recreation information data is set to be updated periodically by receiving data from the outside through the communication unit.
Meanwhile, the unmanned operation unit includes a position information receiving unit for receiving the current position information of the unmanned smart car, a traffic information receiving unit for receiving traffic information to the destination during unmanned operation, and an automatic operation control unit for automatically driving the car from the received current position to the destination.
In addition, the unmanned smart car device using human body signal analysis includes a voice conversation unit for conducting a conversation with the user in the unmanned smart car, and an image output unit for outputting images and music suited to the sensed physical and psychological states of the user.
Meanwhile, the image output unit is configured to receive various touch commands from the user in the unmanned smart car using a touch sensor.
The unmanned smart car communicates, through the communication unit and a wired/wireless communication network, with a recreation information providing server that provides recreation information and a traffic information providing server that provides real-time traffic information.
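To make the cooperation of the units easier to follow, here is a minimal sketch that wires the described components together; every class, attribute, and method name is a hypothetical placeholder, since the specification defines the units only functionally.

```python
class UnmannedSmartCar:
    """Illustrative wiring of the described units (all names are assumed)."""

    def __init__(self, sensor, comm, memory, driver, analyzer, voice, display):
        self.sensor = sensor      # human body signal sensing unit
        self.comm = comm          # communication unit (servers over a wired/wireless network)
        self.memory = memory      # memory unit (user data, baselines, recreation/traffic data)
        self.driver = driver      # unmanned operation unit (position, traffic, auto control)
        self.analyzer = analyzer  # sensing signal analysis unit
        self.voice = voice        # voice conversation unit
        self.display = display    # image output unit (also accepts touch commands)

    def step(self):
        """One control-unit cycle: sense, analyze, and, if needed, guide to a resort."""
        signal = self.sensor.read()
        state = self.analyzer.analyze(signal, self.memory)
        if state.get("needs_rest"):
            resorts = self.comm.recreation_server.search(state)
            choice = self.display.offer(resorts) or self.voice.offer(resorts)
            if choice:
                self.driver.drive_to(choice["location"])
```

In this reading, the body of `step()` corresponds to the control unit's role of requesting analysis, offering rest candidates, and invoking the unmanned operation unit.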
The present invention has the following effects.
First, by sensing and analyzing the human body signal in an unmanned smart car capable of driverless operation, the car can report the user's current state in real time, guide the user to a place suited to his or her mood, and move there automatically, so that the mood of the unmanned smart car user can be optimized.
Second, because the smart car can drive itself, its users can travel comfortably while getting sufficient rest.
Third, when there are several users of the unmanned smart car, analyzing their human body signals and guiding them to a rest together can improve the relationship between them.
FIG. 1 is a block diagram illustrating an unmanned smart car device using human body signal analysis according to the present invention.
FIG. 2 is a flowchart illustrating an unmanned smart car operation method using human body signal analysis according to the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
In addition, although the terms used in the present invention have been selected as general terms that are widely used at present, some terms have been arbitrarily selected by the applicant in specific cases; in such cases, since their meanings are described in detail in the corresponding description, the present invention should be understood based on the meaning of each term rather than its mere name. Further, in describing the embodiments, descriptions of technical content that is well known in the technical field to which the present invention belongs and that is not directly related to the present invention are omitted. This is to convey the gist of the present invention more clearly by omitting unnecessary explanation.
FIG. 1 is a block diagram illustrating an unmanned smart car device using human body signal analysis according to the present invention.
As shown in FIG. 1, an unmanned smart car device using human body signal analysis according to the present invention includes a human body signal sensing unit 110, a communication unit 120, a memory unit 130, an unmanned operation unit 140, a sensing signal analysis unit 150, a voice conversation unit 160, an image output unit 170, and a control unit 180.
Here, the human body signal sensing unit 110 senses a human body signal of the user in the unmanned smart car 100.
The communication unit 120 supports wireless communication between the unmanned smart car 100 and the outside.
The memory unit 130 stores basic data of the unmanned smart car user, general human body signal data, human body signal data measured by the human body signal sensing unit 110, recreation information data, and traffic information data.
The unmanned operation unit 140 automatically drives the unmanned smart car 100 to a destination, and includes a position information receiving unit 141, a traffic information receiving unit 142, and an automatic operation control unit 143.
Here, the position information receiving unit 141 receives the current position information of the unmanned smart car 100.
The traffic information receiving unit 142 receives traffic information to the destination during unmanned smart car operation.
The automatic operation control unit 143 automatically drives the car from the current position received by the position information receiving unit 141 to the destination.
The sensing signal analysis unit 150 compares the human body signal sensed by the human body signal sensing unit 110 with the human body signal data stored in the memory unit 130 and analyzes the physical and psychological states of the unmanned smart car user.
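One plausible, simplified reading of this comparison is sketched below: a sensed measurement is checked against per-user baseline ranges held in the memory unit. The specific signal fields (heart rate, stress index) and thresholds are illustrative assumptions; the patent does not fix which human body signals are used.

```python
from dataclasses import dataclass

@dataclass
class BodySignal:
    heart_rate: float     # beats per minute
    stress_index: float   # 0.0 (calm) .. 1.0 (highly stressed)

@dataclass
class Baseline:
    resting_hr: tuple     # (low, high) resting heart-rate range for this user
    stress_limit: float   # stress level above which a rest is suggested

def analyze(signal: BodySignal, baseline: Baseline) -> dict:
    """Compare a sensed signal with the user's baseline from the memory unit."""
    _, high = baseline.resting_hr          # only the upper bound matters for this simple check
    elevated = signal.heart_rate > high
    stressed = signal.stress_index > baseline.stress_limit
    return {
        "physical_state": "elevated" if elevated else "normal",
        "mental_state": "stressed" if stressed else "calm",
        "needs_rest": elevated or stressed,
    }
```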
The voice conversation unit 160 conducts a conversation with the user in the unmanned smart car. It conveys the sensed results by voice, analyzes the voice input from the user, and carries on a conversation about, for example, whether a rest is needed, guidance to a resort, or the selection of a resort as the destination.
The image output unit 170 outputs images and music suited to the sensed physical and psychological states of the user. In addition, if image data of a resort is available, the images are provided so that the resort can be selected. The image output unit 170 may also receive various touch commands from the user using a touch sensor, such as commands to select a resort or music.
The control unit 180 provides recreation information collected through at least one of the memory unit 130 and the communication unit 120 when a rest is required according to the human body signal analyzed by the sensing signal analysis unit 150, and controls the unmanned operation unit 140 so that the unmanned smart car automatically moves to the resort based on that information.
The unmanned smart car 100 communicates, through the communication unit 120 and a wired/wireless communication network 400, with a recreation information providing server 200 that provides recreation information and a traffic information providing server 300 that provides real-time traffic information.
FIG. 2 is a flowchart illustrating an unmanned smart car operation method using human body signal analysis according to the present invention.
As shown in FIG. 2, in the unmanned smart car operation method using human body signal analysis according to the present invention, the human body signal sensing unit 110 first senses a human body signal of the user in the unmanned smart car 100.
Next, the sensing signal analysis unit 150 of the unmanned smart car 100 analyzes the sensed human body signal by comparing it with at least one of the user's sex, age, preferred resort, preferred food style, and the analysis data on physical and mental states according to human body signals stored in the memory unit 130.
The analysis result is then used to report the physical and mental state of the user through at least one of the voice conversation unit 160 and the image output unit 170 in the unmanned smart car; when there are several users, guidance is provided for each of them.
On the other hand, the control unit 180 determines whether a rest is required according to the analysis result of the sensing signal analysis unit 150 (S130).
If it is determined in step S130 that a rest is required, the control unit 180 searches for the conditions needed for resting through the memory unit 130 and the recreation information providing server 200, and guides the unmanned smart car user to suitable resorts.
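A rough sketch of this rest-and-search step is given below, under the assumption that the memory unit exposes cached recreation data and that the recreation and traffic information providing servers offer simple query calls; both are assumptions, since no server API is specified in the text.

```python
def find_rest_candidates(analysis, memory, recreation_server, traffic_server):
    """Gather resort candidates when the analysis says a rest is needed."""
    if not analysis["needs_rest"]:
        return []

    # locally cached recreation information (memory unit assumed dict-like)
    candidates = list(memory.get("recreation_data", []))

    # ask the external recreation information providing server for more options
    candidates += recreation_server.search(state=analysis, limit=5)

    # annotate travel time via the traffic information providing server,
    # so the pros and cons of each place can be presented to the user
    for place in candidates:
        place["travel_minutes"] = traffic_server.estimate_travel_time(place["location"])
    return candidates
```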
According to this guidance, it is determined whether the unmanned smart car user has selected one of the search results (S160).
If the unmanned smart car user selects one of the plurality of search results, the control unit 180 controls the unmanned operation unit 140 so that the car drives to the selected resort.
Meanwhile, various events (a resting place, a change of destination, and the like) may occur while traveling (S180), and when such an event occurs, it is processed (S190). If the event is a cancellation of the destination, the car waits for a new command.
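The event handling of steps S180 and S190 could be organized as a small dispatch routine like the one below; the event kinds mirror the examples given in the text (a resting stop, a destination change, a cancellation), and all object interfaces are assumed for illustration.

```python
def handle_event(event, driver, voice):
    """Dispatch in-trip events (resting stop, destination change, cancellation)."""
    if event.kind == "change_destination":
        driver.drive_to(event.new_destination)
        voice.say("Destination updated.")
    elif event.kind == "request_stop":
        driver.pause_at_next_safe_spot()
        voice.say("Stopping for a short break.")
    elif event.kind == "cancel_destination":
        driver.stop()
        voice.say("Destination cancelled. Waiting for a new command.")
        return "await_command"
    return "continue"
```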
On the other hand, it is checked whether the car has arrived at the resting place (S200), and when it arrives, the trip to the resting place is completed (S210).
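Tying the steps of FIG. 2 together, an illustrative end-to-end loop might look like the following; it reuses the hypothetical helpers from the earlier sketches (`handle_event`, a `car` object with the assumed units) and adds an assumed `search_resorts` helper, so it is a sketch of the flow rather than the claimed method itself.

```python
import time

def run_trip(car, poll_seconds=60):
    """Illustrative end-to-end loop following FIG. 2 (all interfaces assumed)."""
    signal = car.sensor.read()                           # sense the human body signal
    analysis = car.analyzer.analyze(signal, car.memory)  # compare with stored baseline data
    car.voice.report(analysis)                           # tell the user the current state

    if not analysis["needs_rest"]:
        return

    candidates = car.search_resorts(analysis)            # memory unit + recreation server
    choice = car.display.let_user_pick(candidates)       # touch or voice selection
    if choice is None:
        return

    car.driver.drive_to(choice["location"])
    while not car.driver.arrived():                      # travel while handling events
        for event in car.driver.pending_events():
            if handle_event(event, car.driver, car.voice) == "await_command":
                return
        time.sleep(poll_seconds)
    car.voice.say("We have arrived at the resting place.")
```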
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be apparent to those skilled in the art that various changes and modifications may be made, and the present invention is not limited to the above-described embodiments. Accordingly, the scope of protection of the present invention should be construed according to the following claims, and all technical ideas that fall within the equivalent scope through alteration or substitution are included in the scope of the present invention. In addition, it should be noted that some elements in the drawings are exaggerated or reduced in size relative to their actual configuration in order to explain the configuration more clearly.
100: Unmanned smart car 110: Human body signal sensing unit
120: communication unit 130: memory unit
140: Unmanned operation unit 141: Position information receiving unit
142: traffic information reception unit 143: automatic operation control unit
150: Sensing signal analysis unit 160: Voice conversation unit
170: Image output unit 180: Control unit
200: Recreation information providing server 300: Traffic information providing server
400: communication network
Claims (12)
Guiding, by the control unit of the unmanned smart car, the user in the unmanned smart car to a resort according to the analyzed data; and
controlling, by the control unit of the unmanned smart car, unmanned driving to the resort through an unmanned operation unit that drives the smart car without a driver.
Wherein the analysis result of the sensed signal is guided through at least one of a voice conversation unit and an image output unit in the unmanned smart car, and when there are several users, guidance is provided for each of them, in the method for operating an unmanned smart car using human body signal analysis.
Wherein, when there are several unmanned smart car users, the control unit guides a plurality of resorts suited to what the users have in common, based on the data in the memory unit, and
wherein the resorts found by the search are guided by their features and their pros and cons through the voice conversation unit, which can hold a voice conversation with the users in the unmanned smart car, and through the image output unit, which can output video and audio about each resort, in the method for operating an unmanned smart car using human body signal analysis.
Wherein the step of guiding the unmanned smart car user to a resort according to the analyzed data comprises:
determining whether a rest is necessary according to the analysis result of the sensing signal analysis unit; and
searching, when a rest is necessary, for the conditions needed for resting through the memory unit and a recreation information providing server that provides recreation information.
A communication unit for supporting wireless communication between the unmanned smart car and the outside;
A memory unit for storing basic data of the unmanned smart car user, general human body signal data, human body signal data measured by the human body signal sensing unit, recreation information data, and traffic information data;
An unmanned operation unit for automatically driving the unmanned smart car to a destination;
A sensing signal analysis unit for comparing the human body signal sensed by the human body signal sensing unit with the human body signal data stored in the memory unit to analyze the physical and psychological states of the unmanned smart car user; and
A control unit for providing recreation information collected through at least one of the memory unit and the communication unit when a rest is required according to the human body signal analyzed by the sensing signal analysis unit, and for controlling the unmanned operation unit so that the car automatically moves to the resort based on the recreation information.
Wherein the memory unit stores at least one of the sex, age, preferred resort, and preferred food style of the user in the unmanned smart car, analysis data on the physical and mental states of a person according to human body signals, music data suited to those states, recreation information data, and traffic information data.
Wherein the recreation information data is set to be updated periodically by receiving data from the outside through the communication unit, in the unmanned smart car device using human body signal analysis.
Wherein the unmanned operation unit comprises:
A location information receiving unit for receiving current location information of the unattended smart car;
A traffic information receiving unit for receiving traffic information to the destination during unmanned smart car operation; and
an automatic operation control unit for automatically driving the car from the current position received by the position information receiving unit to the destination.
In the unmanned smart car device using human body signal analysis, further comprising:
a voice conversation unit for conducting a conversation with the user in the unmanned smart car; and
an image output unit for outputting images and music suited to the sensed physical and psychological states of the user of the unmanned smart car.
Wherein the image output unit is configured to receive various touch commands from the user in the unmanned smart car using a touch sensor.
Wherein the unmanned smart car communicates, through the communication unit and a wired/wireless communication network, with a recreation information providing server that provides recreation information and a traffic information providing server that provides real-time traffic information.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150068212A KR20160135067A (en) | 2015-05-15 | 2015-05-15 | Apparatus and method for operating autonomous smart car uging analysis bio signal |
PCT/KR2016/005042 WO2016186381A1 (en) | 2015-05-15 | 2016-05-12 | Method and apparatus for operating unmanned smart car using biometric signal analysis |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150068212A KR20160135067A (en) | 2015-05-15 | 2015-05-15 | Apparatus and method for operating autonomous smart car uging analysis bio signal |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20160135067A true KR20160135067A (en) | 2016-11-24 |
Family
ID=57320613
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150068212A KR20160135067A (en) | 2015-05-15 | 2015-05-15 | Apparatus and method for operating autonomous smart car uging analysis bio signal |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR20160135067A (en) |
WO (1) | WO2016186381A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106775649A (en) * | 2016-11-25 | 2017-05-31 | 深圳市保臻社区服务科技有限公司 | A kind of linguistic context guides the method and device of user |
CN107168313A (en) * | 2017-05-17 | 2017-09-15 | 北京汽车集团有限公司 | Control the method and device of vehicle drive |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004051074A (en) * | 2001-11-13 | 2004-02-19 | Equos Research Co Ltd | In-vehicle device, data preparation device, and data preparation program |
KR101091274B1 (en) * | 2005-01-19 | 2011-12-07 | 현대자동차주식회사 | Manless driving system using telematics and method thereof |
KR100800026B1 (en) * | 2007-01-29 | 2008-02-01 | 대한민국 | Method and system of servicing evacuation of an emergency case |
KR101982117B1 (en) * | 2013-04-30 | 2019-08-28 | 현대엠엔소프트 주식회사 | A human-bio sensing system using a sensor that is provided on the steering wheel of the car and its method of operation |
JP2015052493A (en) * | 2013-09-06 | 2015-03-19 | 株式会社 日立産業制御ソリューションズ | Navigation device and navigation system |
-
2015
- 2015-05-15 KR KR1020150068212A patent/KR20160135067A/en active Search and Examination
-
2016
- 2016-05-12 WO PCT/KR2016/005042 patent/WO2016186381A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2016186381A1 (en) | 2016-11-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2016052881A (en) | Travel control system for vehicle | |
WO2016035268A1 (en) | Travel control system for vehicle | |
KR102562227B1 (en) | Dialogue system, Vehicle and method for controlling the vehicle | |
CN105270292B (en) | System and method for controlling access to human-machine interface in vehicle | |
JP7172321B2 (en) | Driving evaluation device, driving evaluation system, driving evaluation method, and driving evaluation computer program | |
US9476729B2 (en) | Adaptive navigation and location-based services based on user behavior patterns | |
JP4609527B2 (en) | Automotive information provision system | |
US11191468B2 (en) | Vehicular notification apparatus | |
US20180033429A1 (en) | Extendable vehicle system | |
CN106467113A (en) | System and method for driver assistance | |
US10861460B2 (en) | Dialogue system, vehicle having the same and dialogue processing method | |
CN108430819A (en) | Car-mounted device | |
US11004450B2 (en) | Dialogue system and dialogue processing method | |
KR102403355B1 (en) | Vehicle, mobile for communicate with the vehicle and method for controlling the vehicle | |
JP2012112853A (en) | Information processor, in-vehicle navigation device and information processing method | |
US20200070815A1 (en) | Autonomous driving method and apparatus of vehicle, program and recording medium | |
KR20160135069A (en) | Artificial intelligence autonomous smart car and method for operating thereof | |
JP2006251298A (en) | Device and method for inputting voice | |
KR20160135067A (en) | Apparatus and method for operating autonomous smart car uging analysis bio signal | |
JP2018206187A (en) | Stop position determination device, stop position determination system, stop position determination method and program | |
JP2009059229A (en) | Operation support method and operation support system | |
JP6884605B2 (en) | Judgment device | |
CN110562260A (en) | Dialogue system and dialogue processing method | |
JP5293127B2 (en) | Driving assistance device | |
KR20140044036A (en) | Apparatus and method for providing automobile driving environment based on the driver sensibility and operation mode |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AMND | Amendment | ||
E601 | Decision to refuse application | ||
AMND | Amendment |