CN113396382B - Assistance method and assistance system - Google Patents
Assistance method and assistance system
- Publication number
- CN113396382B (application number CN201980091159.XA)
- Authority
- CN
- China
- Prior art keywords
- display
- functional component
- vehicle
- augmented reality
- camera
- Prior art date: 2019-03-29
- Legal status: Active (the status is an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06Q50/10 — Services (ICT specially adapted for implementation of business processes of specific business sectors)
Abstract
An assistance method for assisting a user in grasping a function using an information terminal having a camera and a display includes: a first display step of displaying a captured image obtained by the camera on the display; a determination step of determining a functional component of a vehicle included in the captured image displayed on the display in the first display step; and a second display step of, when the user's intention to grasp a function related to the functional component determined in the determination step is detected, displaying on the display an augmented reality image indicating the operation state produced when the functional component is actually operated, superimposed on the captured image obtained by the camera.
Description
Technical Field
The present invention relates to an assistance method and assistance system for assisting in grasping a function.
Background
In recent years, vehicles have come to provide a wide variety of functions, and users need to grasp them. Patent document 1 discloses the following: in a portable wireless communication terminal such as a smartphone, a captured image is displayed on a display, and guidance (the name) of a component included in the captured image is displayed superimposed on the display; when the guidance of the component displayed on the display is pressed, an operation manual for the component is displayed on the display.
Prior art literature
Patent literature
Patent document 1: japanese patent laid-open No. 2014-215845
Disclosure of Invention
Problems to be solved by the invention
However, merely displaying the guidance for a component and its operation manual on a display, as in patent document 1, makes it difficult for a user to easily grasp the vehicle function concerned, such as what function is actually performed when the component is operated.

Accordingly, an object of the present invention is to enable a user to grasp a function easily and intuitively.
Means for solving the problems
An assistance method according to one aspect of the present invention is an assistance method for assisting in grasping a function using an information terminal having a camera and a display, the assistance method including: a first display step of displaying a captured image obtained by the camera on the display; a determination step of determining a functional component included in the captured image displayed on the display in the first display step; and a second display step of, when a user's intention to grasp a function related to the functional component determined in the determination step is detected, displaying on the display an augmented reality image indicating the operation state produced when the functional component is actually operated, superimposed on the captured image obtained by the camera.
Effects of the invention
According to the present invention, the user can see, for example, how the function actually behaves, and can therefore grasp the function more easily and intuitively than before.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram showing the configuration of an auxiliary system.
Fig. 2 is a sequence diagram showing processing performed between the information terminal and the server apparatus.
Fig. 3 is a flowchart showing a process performed by the processing unit of the information terminal.
Fig. 4 is a view showing how a user photographs a steering wheel in a vehicle with a camera of an information terminal.
Fig. 5 is a diagram showing an example in which a captured image of the steering wheel is displayed on the display of the information terminal.
Fig. 6 is a diagram showing how a user grasps a vehicle function using an information terminal that displays an augmented reality image on its display.
Fig. 7 is a diagram showing an information terminal in which a captured image and an augmented reality image are displayed on a display.
Fig. 8 is a diagram showing an information terminal in which a captured image and an augmented reality image are displayed on a display.
Detailed Description
Hereinafter, embodiments will be described in detail with reference to the drawings. Note that the following embodiments do not limit the invention according to the claims, and not all combinations of features described in the embodiments are necessarily essential to the invention. Two or more of the features described in the embodiments may be combined arbitrarily. The same or similar structures are denoted by the same reference numerals, and repeated description thereof is omitted.
<First embodiment>
A first embodiment of the present invention will be described. Fig. 1 is a block diagram showing the configuration of an assist system 100 according to the present embodiment. The assist system of the present embodiment is a system for assisting a user in grasping vehicle functions, and may include an information terminal 10, a server device 20 (cloud), and a network NTW. Here, the vehicle may be, for example, a four-wheeled vehicle or a saddle-ride type vehicle (a two-wheeled or three-wheeled vehicle); in the present embodiment, a four-wheeled vehicle is taken as an example. As the information terminal 10, for example, a smartphone or a tablet terminal is used; in this embodiment, an example in which a tablet terminal is used as the information terminal 10 will be described. Smartphones and tablet terminals are portable terminals that have various functions besides a call function and that differ from each other in display size; in general, a tablet terminal has a larger display than a smartphone.
First, the configuration of the information terminal 10 will be described. The information terminal 10 may include, for example, a processing section 11, a storage section 12, a camera 13, a display 14, a position detection sensor 15, an attitude detection sensor 16, and a communication section 17. The respective parts of the information terminal 10 are communicably connected to each other via a system bus 18.
The processing unit 11 includes a processor typified by a CPU, a storage device such as a semiconductor memory, an interface with external devices, and the like. The storage unit 12 stores programs executed by the processor and data used by the processor in processing, and the processing unit 11 can read a program stored in the storage unit 12 into a storage device such as a memory and execute it. In the present embodiment, the storage unit 12 stores an application program (auxiliary program) for assisting the user in grasping vehicle functions, and the processing unit 11 reads the auxiliary program stored in the storage unit 12 into a storage device such as a memory and executes it.
The camera 13 includes a lens and an imaging element and captures a subject to obtain a captured image. The camera 13 may be provided, for example, on the outer surface opposite the outer surface on which the display 14 is provided. The display 14 reports information to the user by displaying images; in the present embodiment, the display 14 can display the captured image acquired by the camera 13 in real time. The display 14 of the present embodiment includes, for example, a touch-panel LCD (Liquid Crystal Display) and thus has a function of receiving input from the user in addition to the function of displaying images. However, the present invention is not limited to this; the display 14 may have only the image display function, and an input unit (for example, a keyboard or a mouse) may be provided separately from the display 14.
The position detection sensor 15 detects the position and orientation of the information terminal 10. As the position detection sensor 15, for example, a GPS sensor that receives signals from GPS satellites to acquire the current position of the information terminal 10, and an azimuth sensor that detects, based on geomagnetism or the like, the direction in which the camera 13 of the information terminal 10 is pointed, are used. In the present embodiment, the term "position of the information terminal 10" includes the azimuth of the information terminal 10 in addition to its location. The attitude detection sensor 16 detects the attitude of the information terminal 10; as the attitude detection sensor 16, for example, an acceleration sensor or a gyro sensor can be used.
The communication unit 17 is communicably connected to the server device 20 via the network NTW. Specifically, the communication unit 17 functions as a receiving unit that receives information from the server device 20 via the network NTW and as a transmitting unit that transmits information to the server device 20 via the network NTW. In the present embodiment, the communication unit 17 transmits to the server device 20 information (operation information) indicating the kind of functional component the user operated on the display 14 and its operation mode. The communication unit 17 may also receive from the server device 20 data of an augmented reality (AR: Augmented Reality) image representing the operation state of the vehicle when a functional component of the vehicle is actually operated.
As a specific configuration, the processing unit 11 may include, for example, a first acquisition unit 11a, a second acquisition unit 11b, a determination unit 11c, a detection unit 11d, and a display control unit 11e. The first acquisition unit 11a acquires data of the captured image obtained by the camera 13. The second acquisition unit 11b acquires data of the augmented reality image from the server device 20 via the communication unit 17. The determination unit 11c analyzes the captured image by image processing such as pattern matching and determines the functional component of the vehicle included in the captured image displayed on the display 14. The detection unit 11d detects the user's intention to grasp the function related to the functional component determined by the determination unit 11c. In the present embodiment, the detection unit 11d detects, as the user's intention, that the user has operated on the display 14 the functional component determined by the determination unit 11c; however, it may instead detect, for example, the user's voice or line of sight, or the operation of a button provided on the information terminal 10. The detection unit 11d transmits information indicating the kind of functional component whose operation was detected (that is, the user's intention to grasp the function) and its operation mode to the server device 20 via the communication unit 17. The display control unit 11e displays the captured image acquired by the first acquisition unit 11a on the display 14. When data of the augmented reality image is acquired by the second acquisition unit 11b, the display control unit 11e causes the display 14 to display the augmented reality image superimposed on the captured image, based on the position and attitude of the information terminal 10 detected by the position detection sensor 15 and the attitude detection sensor 16, respectively.
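As a concrete illustration of this division of roles, the sketch below models the units 11a to 11e as methods of one class. It is a minimal sketch under assumed names: ProcessingUnit, the stub callables, and the toy image format are inventions for illustration, not the patent's implementation.

```python
# Minimal sketch of the processing unit 11; all names are assumptions.
class ProcessingUnit:
    def __init__(self, capture, request_ar, render):
        self.capture = capture        # stands in for the camera 13
        self.request_ar = request_ar  # stands in for communication unit 17 -> server 20
        self.render = render          # stands in for the display 14

    def first_acquisition(self):                 # 11a: fetch the captured image
        return self.capture()

    def determination(self, image):              # 11c: find functional components
        return list(image.get("components", []))

    def detection(self, touch, components):      # 11d: user's intention to grasp
        return touch if touch in components else None

    def second_acquisition(self, component):     # 11b: get AR data for the component
        return self.request_ar(component)

    def display_control(self, image, ar=None):   # 11e: render, superimposing AR
        self.render((image, ar))

# One frame of the flow, wired with trivial stubs:
pu = ProcessingUnit(lambda: {"components": ["acc_switch"]},
                    lambda c: f"ar-for-{c}",
                    print)
img = pu.first_acquisition()
hit = pu.detection("acc_switch", pu.determination(img))
pu.display_control(img, pu.second_acquisition(hit) if hit else None)
```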
Next, the configuration of the server device 20 will be described. The server device 20 may include a processing unit 21, a storage unit 22, and a communication unit 23. The processing unit 21 includes a processor typified by a CPU, a storage device such as a semiconductor memory, an interface with external devices, and the like. The storage unit 22 stores programs executed by the processor and data used by the processor in processing, and the processing unit 21 can read a program stored in the storage unit 22 into a storage device such as a memory and execute it. The communication unit 23 is communicably connected to the information terminal 10 via the network NTW; specifically, it functions as a receiving unit that receives information from the information terminal 10 via the network NTW and as a transmitting unit that transmits information to the information terminal 10 via the network NTW.
In the present embodiment, the storage unit 22 stores, for each of the plural kinds of functional components provided in the vehicle, data of an augmented reality image representing the operation state (function) of the vehicle when that functional component is actually operated. The processing unit 21 receives from the information terminal 10, via the communication unit 23, information indicating the kind of functional component and the operation mode detected by the processing unit 11 (detection unit 11d) of the information terminal 10 and, based on the received information, transmits the data of the augmented reality image stored in the storage unit 22 in association with that functional component to the information terminal 10 via the communication unit 23.
[Processing sequence of the assist system]
Next, a processing sequence of the assist system 100 will be described. Fig. 2 is a sequence diagram showing a process performed between the information terminal 10 and the server apparatus 20.
When execution of the auxiliary program starts on the information terminal 10, the information terminal 10 starts shooting with the camera 13 (step 101) and displays the captured image obtained by the camera 13 on the display 14 in real time (step 102). The information terminal 10 also starts detecting its position with the position detection sensor 15 and its attitude with the attitude detection sensor 16 (step 103). The information terminal 10 then analyzes the captured image obtained by the camera 13 and determines the functional components of the vehicle included in the captured image displayed on the display 14 (step 104). When it detects that the user has operated a determined functional component on the display 14 (step 105), the information terminal 10 transmits operation information indicating the operated functional component and its operation mode to the server device 20 (step 106).
The storage unit 22 of the server device 20 stores, for each of the plural functional components provided in the vehicle, data of an augmented reality image (for example, an animation) indicating the operation state of the vehicle when the functional component is actually operated. On receiving the operation information from the information terminal 10, the server device 20 selects the data of the augmented reality image corresponding to the operation information (functional component and operation mode) from among the plural augmented reality images stored in the storage unit 22 (step 107) and transmits the selected data to the information terminal 10 (step 108). The information terminal 10 displays the augmented reality image received from the server device 20 on the display 14 superimposed on the captured image obtained by the camera 13 (step 109). At this time, the information terminal 10 aligns the captured image of the camera 13 with the augmented reality image based on the position detected by the position detection sensor 15 and the attitude detected by the attitude detection sensor 16, so that the augmented reality image moves in accordance with the movement of the information terminal 10.
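The sketch below walks through this sequence (steps 101 to 109) end to end. It is a minimal sketch under stated assumptions: ToyTerminal, the request_ar callable, and the "tap" operation mode are hypothetical stand-ins, and the real exchange of steps 106 to 108 would cross the network NTW via the communication units 17 and 23.

```python
# Minimal sketch of the Fig. 2 sequence; all names are assumptions.
class ToyTerminal:
    def start_camera(self): pass                        # step 101
    def start_sensors(self): pass                       # step 103
    def capture(self): return "frame"                   # step 102: live view
    def determine(self, image): return ["acc_switch"]   # step 104
    def detect_operation(self, comps): return comps[0]  # step 105: user taps a switch
    def pose(self): return (0.0, 0.0, 0.0)              # position/attitude sensors
    def show(self, image, ar=None, pose=None):          # step 109: superimpose AR
        print(image, ar, pose)

def assist_session(terminal, request_ar):
    terminal.start_camera()
    terminal.start_sensors()
    image = terminal.capture()
    hit = terminal.detect_operation(terminal.determine(image))
    ar = request_ar(hit, "tap") if hit else None        # steps 106-108 (over NTW)
    terminal.show(image, ar, terminal.pose())

assist_session(ToyTerminal(), lambda comp, mode: f"ar-for-{comp}")
```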
[Processing in the information terminal]
Next, a process performed by the information terminal 10 when executing the auxiliary program will be described. Fig. 3 is a flowchart showing a process performed by the processing unit 11 of the information terminal 10.
The vehicle has various functions, such as functions used during driving, functions that improve comfort in the vehicle, and functions that improve safety, and functional components for performing these functions may be provided in the vehicle. Examples of functions used during driving include the direction indicators (turn signals), the wipers, the parking brake, and the transmission. Examples of functions for improving comfort include the air conditioner, the seat heaters, and the audio system. Functions for improving safety include a cruise control with inter-vehicle distance control (hereinafter ACC: Adaptive Cruise Control) and a lane keeping assist system (hereinafter LKAS: Lane Keeping Assist System). The assist system 100 of the present embodiment can assist the user in grasping any of the various vehicle functions above; an example of assisting in grasping safety functions such as ACC and LKAS is described below. Here, it is assumed that the functional components (switches) for performing the ACC and LKAS functions are provided on the steering wheel.
In S11, the processing unit 11 (first acquisition unit 11a) causes the camera 13 to start shooting and acquires the captured image from the camera 13. In S12, the processing unit 11 (display control unit 11e) sequentially displays the captured images acquired from the camera 13 on the display 14. For example, fig. 4 shows how the user photographs the steering wheel 2 in the vehicle with the camera 13 of the information terminal 10; in this case, the captured images of the steering wheel 2 obtained by the camera 13 are sequentially displayed on the display 14 of the information terminal 10. Note that fig. 4 shows the in-vehicle environment seen by a user riding in the vehicle and illustrates, in addition to the steering wheel 2, the front windshield 1, the front pillar 3, the instrument panel 4, and the meter panel 5.
In S13, the processing unit 11 (determination unit 11c) determines the functional components of the vehicle included in the captured image displayed on the display 14. For example, the processing unit 11 first performs known image processing to identify the parts included in the captured image. The storage unit 12 of the information terminal 10 stores feature information for each of the plural functional components provided in the vehicle, and the processing unit 11 checks whether any functional component's features match those of an identified part to a high degree (that is, with a degree of coincidence exceeding a predetermined value). In this way, the processing unit 11 can determine the functional components included in the captured image, as sketched below.
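The following is a minimal sketch of this matching step under loudly labeled assumptions: the feature vectors, the cosine-similarity score, and the threshold value are placeholders for the unspecified feature information and degree-of-coincidence measure.

```python
import math

# Hypothetical feature database standing in for storage unit 12's feature info.
FEATURE_DB = {
    "acc_switch":  [0.9, 0.1, 0.3],
    "lkas_switch": [0.2, 0.8, 0.5],
}
MATCH_THRESHOLD = 0.95  # "degree of coincidence exceeding a predetermined value"

def coincidence(a, b):
    # Cosine similarity as a stand-in coincidence measure.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def determine_component(part_feature):
    """S13: return the best-matching functional component, or None (back to S12)."""
    best = max(FEATURE_DB, key=lambda k: coincidence(FEATURE_DB[k], part_feature))
    return best if coincidence(FEATURE_DB[best], part_feature) > MATCH_THRESHOLD else None

print(determine_component([0.88, 0.12, 0.31]))  # -> acc_switch
```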
In S14, the processing unit 11 determines whether a functional component has been determined in the captured image displayed on the display 14. If a functional component has been determined, the process proceeds to S15; otherwise, it returns to S12.
In S15, the processing unit 11 displays an augmented reality image indicating the name of the functional component determined in S13 on the display 14, superimposed on the captured image obtained by the camera 13. At this time, based on the position and attitude of the information terminal 10 detected by the position detection sensor 15 and the attitude detection sensor 16, the processing unit 11 displays the augmented reality image of the name so that it matches the position of the functional component in the captured image displayed on the display 14. For example, fig. 5 shows an example in which a captured image of the steering wheel 2 is displayed on the display 14 of the information terminal 10. In the example shown in fig. 5, the ACC switch 2a, the LKAS switch 2b, the cancel switch 2c, and the inter-vehicle distance setting switch 2d are determined in the captured image of the steering wheel 2, and augmented reality images 31 to 34 indicating their names are displayed on the display 14 superimposed on the captured image of the steering wheel.
In the present embodiment, the augmented reality images indicating the names of the plural functional components provided in the vehicle are stored in the storage unit 12; however, the present invention is not limited to this, and they may instead be stored in the storage unit 22 of the server device 20. In that case, the processing unit 11 may transmit identification information of the functional component determined in S13 to the server device 20 via the communication unit 17 and receive the data of the augmented reality image indicating the name of the functional component from the server device 20 via the communication unit 17.
In S16, the processing unit 11 determines whether the user has operated, on the display 14, the functional component determined in S13. In the present embodiment, since the touch-panel display 14 is used, the processing unit 11 determines whether the user's finger or the like has operated the functional component on the display. However, the present invention is not limited to this; for example, when a non-touch-panel display is used, it may be determined whether the functional component has been operated on the display via an input unit such as a mouse. If the user has operated the functional component on the display 14, the process proceeds to S17; otherwise, it returns to S12.
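One plausible form of this S16 check, sketched below under assumed names, is a hit test of the touch point against the screen-space bounding box of each determined component; the box representation is an assumption, since the patent does not fix one.

```python
from dataclasses import dataclass

@dataclass
class DeterminedComponent:
    kind: str
    box: tuple  # (x, y, width, height) of the component in the displayed image

def operated_component(touch_xy, components):
    """S16: return the component under the touch point, or None (back to S12)."""
    tx, ty = touch_xy
    for c in components:
        x, y, w, h = c.box
        if x <= tx <= x + w and y <= ty <= y + h:
            return c
    return None

switches = [DeterminedComponent("acc_switch", (120, 400, 60, 40)),
            DeterminedComponent("lkas_switch", (200, 400, 60, 40))]
print(operated_component((150, 420), switches).kind)  # -> acc_switch
```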
In S17, the processing unit 11 transmits to the server device 20, via the communication unit 17, information (operation information) indicating the kind of functional component the user operated on the display 14 and its operation mode. In S18, the processing unit 11 (second acquisition unit 11b) receives, from the server device 20 via the communication unit 17, the augmented reality image corresponding to the operation information from among the plural augmented reality images stored in the storage unit 22 of the server device 20.
Here, in order to explain the vehicle function, the augmented reality image acquired in S18 is an augmented reality image (for example, an animation) representing the operation state of the vehicle when the functional component is actually operated, and it may include, for example, at least one of a change in the vehicle state and a change in the environment around the vehicle when the functional component is actually operated. The "vehicle state" included in the augmented reality image is, for example, a virtual representation of the operation or function the vehicle performs when the functional component is actually operated, and may include a visualization of non-visual information that appears when the functional component is actually operated. An example of such non-visual information is the radio waves of a millimeter-wave radar, a LIDAR, or the like emitted from the vehicle when the ACC switch 2a is operated. The "environment around the vehicle" included in the augmented reality image is, for example, the road, lane lines, preceding vehicle, and the like that virtually change around the vehicle when the functional component is actually operated.
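As one way to picture how such non-visual information could be visualized, the sketch below builds a fan-shaped polygon for a forward radar beam that a display control unit could draw over the captured image; the range, beam width, and sampling values are invented for illustration, not figures from the patent.

```python
import math

def beam_polygon(origin=(0.0, 0.0), heading_deg=90.0, range_m=80.0,
                 half_angle_deg=8.0, steps=8):
    """Vertices of a fan: the beam origin plus sampled points along the arc."""
    pts = [origin]
    for i in range(steps + 1):
        a = math.radians(heading_deg - half_angle_deg
                         + 2.0 * half_angle_deg * i / steps)
        pts.append((origin[0] + range_m * math.cos(a),
                    origin[1] + range_m * math.sin(a)))
    return pts

print(len(beam_polygon()))  # 10 vertices for the default 8-step fan
```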
In S19, the processing unit 11 (display control unit 11e) displays the augmented reality image acquired in S18 on the display 14, superimposed on the captured image obtained by the camera 13. At this time, based on the position and attitude of the information terminal 10 detected by the position detection sensor 15 and the attitude detection sensor 16, the processing unit 11 aligns the captured image of the camera 13 with the augmented reality image acquired in S18 so that the augmented reality image moves in accordance with the movement of the information terminal 10.
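The essence of this alignment can be sketched as re-projecting a world-anchored AR element into screen coordinates from the current terminal pose; the 2-D rotation-plus-translation below is a deliberately simplified stand-in for the real camera model, which the patent does not specify.

```python
import math

def world_to_screen(anchor_xy, terminal_xy, azimuth_deg,
                    pixels_per_meter=100.0, screen_center=(640, 360)):
    """Project a world-frame AR anchor into screen coordinates for this pose."""
    dx = anchor_xy[0] - terminal_xy[0]
    dy = anchor_xy[1] - terminal_xy[1]
    a = math.radians(-azimuth_deg)  # undo the terminal's heading
    rx = dx * math.cos(a) - dy * math.sin(a)
    ry = dx * math.sin(a) + dy * math.cos(a)
    return (screen_center[0] + rx * pixels_per_meter,
            screen_center[1] - ry * pixels_per_meter)

# As the terminal turns, the same anchor lands at a different screen position,
# so the drawn overlay tracks the live captured image.
print(world_to_screen((2.0, 5.0), (0.0, 0.0), 0.0))
print(world_to_screen((2.0, 5.0), (0.0, 0.0), 15.0))
```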
Fig. 6 shows how the augmented reality image acquired in S18 is displayed on the display 14 of the information terminal 10 while the user grasps a vehicle function using the information terminal 10. Specifically, the user points the camera 13 of the information terminal 10 at the outside of the vehicle through the front windshield 1, and the augmented reality image acquired when the functional component was operated on the display 14 is displayed on the display 14 of the information terminal 10, superimposed on the captured image of the camera 13.
For example, fig. 7 shows an example in which the augmented reality image acquired in S18 when the ACC switch 2a is operated on the display 14 is displayed on the display 14 of the information terminal 10, superimposed on the captured image of the camera 13. In the example shown in fig. 7, a road 41, lane lines 42, and a preceding vehicle 43 are displayed as augmented reality images on the display 14. The display 14 also displays, as an augmented reality image, the state of radio waves 44 (for example, millimeter-wave radar) emitted from the front of the vehicle toward the preceding vehicle 43 when ACC functions, together with a description 45 of the ACC function. Note that the instrument panel 4 shown on the display 14 is a captured image captured by the camera 13. Such an augmented reality image is, for example, an animation and may be displayed on the display 14 so that its appearance changes according to the position, direction, and attitude of the information terminal 10. Further, when the inter-vehicle distance setting switch 2d is operated on the display 14, the distance to the preceding vehicle 43 displayed as an augmented reality image on the display 14 may be changed according to the distance set by the operation.
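As a minimal sketch of that last behavior, the mapping below derives the rendered gap to the AR preceding vehicle 43 from the setting chosen with the switch 2d; the preset values and the three-level setting are assumptions for illustration.

```python
# Hypothetical short/middle/long inter-vehicle distance presets (meters).
GAP_PRESETS_M = {1: 25.0, 2: 40.0, 3: 60.0}

def preceding_vehicle_offset(setting: int, ego_position_m: float = 0.0) -> float:
    """World-frame position at which to draw the AR preceding vehicle 43."""
    return ego_position_m + GAP_PRESETS_M[setting]

print(preceding_vehicle_offset(1), preceding_vehicle_offset(3))  # 25.0 60.0
```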
Fig. 8 shows an example in which the augmented reality image acquired in S18 when the LKAS switch 2b is operated on the display 14 is displayed on the display 14 of the information terminal 10, superimposed on the captured image of the camera 13. In the example shown in fig. 8, a road 51 and lane lines 52 are displayed as augmented reality images on the display 14. The operation performed when LKAS functions is also displayed as an augmented reality image on the display 14: specifically, a symbol 53 and a description 54 indicating that the vehicle is deviating from the lane line 52, and a symbol 55 and a description 56 for assisting the operation of the steering wheel 2 in that case. Note that the instrument panel 4 shown on the display 14 is a captured image captured by the camera 13. Such an augmented reality image is, for example, an animation and, as in the example shown in fig. 7, may be displayed on the display 14 so that its appearance changes according to the position, direction, and attitude of the information terminal 10.
As described above, in the assist system 100 according to the present embodiment, when the user operates a functional component included in a captured image displayed on the display 14, an augmented reality image indicating the operation state of the vehicle when the functional component is actually operated is displayed on the display 14 so as to be superimposed on the captured image. Thus, the user can visually grasp how the vehicle functions when operating the functional components of the vehicle. That is, the assist system 100 according to the present embodiment allows the user to easily grasp the vehicle functions.
<Other embodiments>
In the above embodiment, an example in which the data of the augmented reality images is stored in the storage unit 22 of the server device 20 has been described; however, the present invention is not limited to this, and the data of the augmented reality images may be stored in the storage unit 12 of the information terminal 10. In this case, since no data needs to be exchanged between the information terminal 10 and the server device 20, the above-described auxiliary program can be executed even when the information terminal 10 is offline.
In addition, in the above-described embodiment, the assist system 100 for assisting the user in grasping vehicle functions has been described; however, the assist system 100 is not limited to vehicle functions and may be used to assist in grasping the functions of other objects. The object may be any object whose state changes according to the operation of a functional component, and the state change may be electrical or mechanical.
Summary of the embodiments
1. The assistance method of the above embodiment is an assistance method for assisting in grasping a function using an information terminal (e.g., 10) having a camera (e.g., 13) and a display (e.g., 14),
the assistance method including:
a first display step of displaying a captured image obtained by the camera on the display;
a determination step of determining a functional component (e.g., 2a to 2d) included in the captured image displayed on the display in the first display step; and
a second display step of, when a user's intention to grasp a function related to the functional component determined in the determination step is detected, displaying on the display an augmented reality image representing the operation state produced when the functional component is actually operated, superimposed on the captured image obtained by the camera.
According to this configuration, even without consulting a manual on the functions and operations of the functional component, the user can visually and intuitively grasp what function is performed when the functional component is operated. That is, the user can virtually experience the function and can therefore grasp it easily.
2. In the assistance method of the above embodiment,
in the second display step, at least one of the state of an object that functions when the functional component is actually operated and the surrounding environment of the object at that time is displayed on the display as the augmented reality image.
According to this configuration, the function of the object can be grasped more visually, so the user can grasp the function of the object more easily.
3. In the assistance method of the above embodiment,
in the second display step, non-visual information that appears when the functional component is actually operated is visualized and displayed on the display as the augmented reality image.
According to this configuration, the user can visually grasp how the function works and can therefore grasp the function more easily.
4. In the assistance method of the above embodiment,
the non-visual information includes radio waves emitted when the functional component is actually operated.
According to this configuration, the radio waves emitted for the function are represented visually, so the user can grasp the function more easily.
5. In the assistance method of the above embodiment,
in the second display step, an operation of the functional component performed by the user on the display is detected as the user's intention to grasp a function related to the functional component.
With this configuration, the user's wish to grasp the function can be reliably detected.
6. In the assistance method of the above embodiment,
The functional component is a component provided in the vehicle,
In the second display step, the augmented reality image representing the operation state of the vehicle when the functional component is actually operated is displayed on the display.
According to this configuration, even without consulting a manual on the vehicle functions and the operations of the functional components, the user can visually and intuitively grasp how the vehicle functions when a functional component is operated.
The present invention is not limited to the above embodiments, and various changes and modifications can be made without departing from the spirit and scope of the present invention. Accordingly, to disclose the scope of the present invention, the following claims are appended.
Description of the reference numerals
10: information terminal; 11: processing unit; 12: storage unit; 13: camera; 14: display; 15: position detection sensor; 16: attitude detection sensor; 17: communication unit; 20: server device; 21: processing unit; 22: storage unit; 23: communication unit.
Claims (5)
1. An assistance method for assisting in grasping a function by using an information terminal having a camera and a display, characterized in that
the assistance method comprises the following steps:
a first display step of displaying a captured image obtained by the camera on the display;
a determination step of determining a functional component included in the captured image displayed on the display in the first display step; and
a second display step of, when a user's intention to grasp a function related to the functional component determined in the determination step is detected, displaying on the display an augmented reality image representing the operation state produced when the functional component is actually operated and a description of the function of the functional component, superimposed on the captured image obtained by the camera,
wherein, in the second display step, non-visual information that appears when the functional component is actually operated is visualized and displayed on the display as the augmented reality image,
The functional component is a component provided in the vehicle,
The functional component comprises an ACC switch,
the non-visual information includes radio waves emitted when the functional component is actually operated,
the radio waves are radio waves of a millimeter-wave radar or a LIDAR emitted from the vehicle when the ACC switch is operated, and
in the second display step, a state in which the millimeter-wave radar or the LIDAR irradiates a preceding vehicle is displayed on the display.
2. The assistance method according to claim 1, wherein,
in the second display step, at least one of a state of the vehicle functioning when the functional component is actually operated and the surrounding environment of the vehicle when the functional component is actually operated is displayed on the display as the augmented reality image, and
the surrounding environment virtually represents a road, a lane, and a preceding vehicle that change around the vehicle when the functional component is actually operated.
3. The assistance method according to claim 1, wherein,
in the second display step, an operation of the functional component performed by the user on the display is detected as the user's intention to grasp a function related to the functional component.
4. The assistance method according to any one of claims 1 to 3, characterized in that,
In the second display step, the augmented reality image representing the operation state of the vehicle when the functional component is actually operated is displayed on the display.
5. An assistance system for assisting in grasping a function by using an information terminal having a camera and a display, characterized in that,
The information terminal includes:
A first display means for displaying a captured image obtained by the camera on the display;
A determination means for determining a functional component included in the captured image displayed on the display by the first display means; and
a second display means for, when a user's intention to grasp a function related to the functional component determined by the determination means is detected, displaying on the display an augmented reality image representing the operation state produced when the functional component is actually operated and a description of the function of the functional component, superimposed on the captured image obtained by the camera,
wherein the second display means visualizes non-visual information that appears when the functional component is actually operated and displays it on the display as the augmented reality image,
The functional component is a component provided in the vehicle,
The functional component comprises an ACC switch,
the non-visual information includes radio waves emitted when the functional component is actually operated,
the radio waves are radio waves of a millimeter-wave radar or a LIDAR emitted from the vehicle when the ACC switch is operated, and
the second display means displays, on the display, a state in which the millimeter-wave radar or the LIDAR irradiates a preceding vehicle.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2019/014252 WO2020202345A1 (en) | 2019-03-29 | 2019-03-29 | Assistance method and assistance system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113396382A CN113396382A (en) | 2021-09-14 |
CN113396382B (en) | 2024-09-10
Family
ID=72667270
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201980091159.XA (granted as CN113396382B, active) | Assistance method and assistance system | 2019-03-29 | 2019-03-29
Country Status (3)
Country | Link |
---|---|
JP (1) | JP7117454B2 (en) |
CN (1) | CN113396382B (en) |
WO (1) | WO2020202345A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114758100B (en) * | 2022-04-07 | 2025-02-11 | Oppo广东移动通信有限公司 | Display method, device, electronic device and computer-readable storage medium |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160091168A (en) * | 2015-01-23 | 2016-08-02 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015043180A (en) * | 2013-08-26 | 2015-03-05 | ブラザー工業株式会社 | Image processing program |
KR102105463B1 (en) * | 2013-09-02 | 2020-04-28 | 엘지전자 주식회사 | Mobile terminal and control method for the mobile terminal |
JP2015118556A (en) * | 2013-12-18 | 2015-06-25 | マイクロソフト コーポレーション | Augmented reality overlay for control devices |
KR101979694B1 (en) * | 2016-11-04 | 2019-05-17 | 엘지전자 주식회사 | Vehicle control device mounted at vehicle and method for controlling the vehicle |
- 2019-03-29: WO application PCT/JP2019/014252 filed (published as WO2020202345A1)
- 2019-03-29: JP application JP2021511720A, granted as JP7117454B2 (active)
- 2019-03-29: CN application CN201980091159.XA, granted as CN113396382B (active)
Also Published As
Publication number | Publication date |
---|---|
CN113396382A (en) | 2021-09-14 |
WO2020202345A1 (en) | 2020-10-08 |
JP7117454B2 (en) | 2022-08-12 |
JPWO2020202345A1 (en) | 2021-12-02 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |