WO2012133983A1 - Image processing in an image display device mounted on a vehicle (차량에 장착되는 영상표시기기에서의 이미지 처리) - Google Patents
Image processing in an image display device mounted on a vehicle
- Publication number
- WO2012133983A1 (PCT application PCT/KR2011/003092)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- text
- mobile terminal
- vehicle
- speed
- Prior art date
Classifications
- G—PHYSICS
  - G06F—ELECTRIC DIGITAL DATA PROCESSING
    - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
    - G06F3/14—Digital output to display device; cooperation and interconnection of the display device with other functional units
    - G06F3/16—Sound input; sound output
  - G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    - G01C21/3688—Input/output arrangements for on-board computers: systems comprising multiple parts or multiple output devices (not client-server), e.g. detachable faceplates, key fobs or multiple output screens
- B—PERFORMING OPERATIONS; TRANSPORTING
  - B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    - B60K35/00—Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
    - B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    - B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    - B60K35/21—Output arrangements using visual output, e.g. blinking lights or matrix displays
    - B60K35/22—Display screens
    - B60K35/25—Output arrangements using haptic output
    - B60K35/26—Output arrangements using acoustic output
    - B60K35/28—Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information, or by its purpose, e.g. for attracting the attention of the driver
    - B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    - B60K35/80—Arrangements for controlling instruments
    - B60K35/81—Arrangements for controlling instruments for controlling displays
    - B60K35/85—Arrangements for transferring vehicle- or driver-related data
    - B60K2360/11—Instrument graphical user interfaces or menu aspects
    - B60K2360/1438—Touch screens (touch-sensitive instrument input devices)
    - B60K2360/1468—Touch gesture (instrument input by gesture)
    - B60K2360/148—Instrument input by voice
    - B60K2360/151—Instrument output devices for configurable output
    - B60K2360/16—Type of output information
    - B60K2360/164—Infotainment
    - B60K2360/167—Vehicle dynamics information
    - B60K2360/184—Displaying the same information on different displays
    - B60K2360/186—Displaying information according to relevancy
    - B60K2360/1868—Displaying information according to relevancy according to driving situations
    - B60K2360/191—Highlight information
    - B60K2360/563—Vehicle displaying mobile device information
    - B60K2360/569—Vehicle controlling mobile device functions
    - B60K2360/583—Data transfer between instruments
  - B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    - B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
    - B60W2050/143—Alarm means
    - B60W2050/146—Display means
    - B60W2520/10—Longitudinal speed
Definitions
- The present disclosure relates to an electronic device and, more particularly, to image processing in an image display device mounted on a vehicle, and to a mobile terminal for providing images to such a device.
- An image display device is a device having a function of outputting an image that a user can watch.
- The user can watch an image in the vehicle through an image display device mounted in the vehicle.
- The image display device mounted in a vehicle may receive, through a wired or wireless connection, an image that a user can watch from an external device in the vehicle, such as a mobile terminal.
- An object of the present disclosure is to provide an image display device that extracts text from a screen image provided from a mobile terminal and, depending on the speed of the vehicle, converts the extracted text into voice and outputs the voice, so that the user can safely and conveniently check the image provided from the mobile terminal while driving.
- Another object of the present disclosure is to provide a mobile terminal that extracts text from its screen image and, depending on the speed of the vehicle, provides the extracted text together with the screen image to an image display device mounted on the vehicle, so that the user can safely and conveniently check the image through the image display device while driving.
- To achieve the above objects, an image display device disclosed herein includes a communication unit configured to receive an image from a mobile terminal;
- a display unit configured to display the received image;
- a controller configured to acquire text corresponding to the displayed image, obtain the speed of the vehicle, and convert the acquired text into voice when the speed exceeds a threshold speed; and
- a sound output unit configured to output the converted voice.
- The communication unit may request the text corresponding to the displayed image from the mobile terminal and receive the requested text.
- The communication unit may receive, from the mobile terminal, a message indicating that the displayed image is a text-based image, and when the message is received, the controller requests the text corresponding to the displayed image from the mobile terminal.
- When the speed exceeds the threshold speed, the display unit may display an indicator indicating that audio output of the text corresponding to the image is available.
- The controller may perform optical character recognition on the displayed image to extract the text corresponding to the displayed image.
- The controller may select a voice output area from the displayed image and extract the text corresponding to the selected area.
- The device may further include an input unit configured to receive an input for selecting the voice output area from the displayed image.
- The controller may detect a plurality of text areas in the displayed image and select a first area of the plurality of text areas as the voice output area.
- After selecting the first area as the voice output area, the controller may subsequently select, as the voice output area, a second area of the plurality of text areas different from the first area.
- Alternatively, the controller may select, as the voice output area together with the first area, a second area of the plurality of text areas different from the first area.
- The display unit may visually distinguish the voice output area from the other areas of the displayed image.
- The device may further include a microphone configured to receive a voice input, and the controller may convert the voice input into text and determine at least one object corresponding to the converted text in the displayed image.
- The communication unit may transmit a control signal for the determined at least one object to the mobile terminal.
- The device may further include an input unit configured to receive an input for selecting any one of the at least one determined object, and the communication unit may transmit a control signal for the selected object to the mobile terminal.
- The device may further include a speed sensor for calculating the speed of the vehicle and a GPS module for acquiring GPS information of the vehicle, and the controller may acquire the speed of the vehicle from the speed sensor or obtain the speed of the vehicle based on the GPS information.
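The speed-gated voice output described above can be sketched as follows. This is a minimal illustration only: the function names, the threshold value of 5 km/h, and the injected `speak`/`display` callbacks are hypothetical stand-ins for the sound output unit and display unit, not details taken from the patent.

```python
def should_voice_output(speed_kmh, threshold_kmh=5.0):
    """Voice output is enabled only while the vehicle exceeds the threshold speed."""
    return speed_kmh > threshold_kmh


def handle_received_image(image_text, speed_kmh, speak, display):
    """Display the text acquired for the received image; convert it to voice
    only when the speed gate is open.

    `speak` and `display` are injected callbacks standing in for the device's
    sound output unit and display unit.
    """
    display(image_text)
    if should_voice_output(speed_kmh):
        speak(image_text)
        return "voice"
    return "display-only"
```

At parking speed the image is only displayed; above the threshold the same text is additionally routed to the sound output unit, matching the claimed behavior.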
- A mobile terminal disclosed herein for achieving the above object includes a controller configured to acquire text corresponding to a screen image and to obtain the speed of the vehicle, and a communication unit configured to transmit the screen image to an image display device mounted in the vehicle and, when the speed exceeds a threshold speed, to transmit the acquired text together with the screen image to the image display device.
- When the application corresponding to the screen image is a text-based application, the controller may obtain the text from the application.
- In that case, the communication unit may transmit, to the image display device, a message indicating that the image is a text-based image.
- When the application corresponding to the image is not a text-based application, the controller may perform optical character recognition on the image to extract the text corresponding to the image.
- The communication unit may acquire GPS information of the mobile terminal, and the controller may acquire the speed of the vehicle based on the GPS information or obtain the speed of the vehicle from the image display device.
- The image display device disclosed herein provides an interface for checking, visually or audibly according to the traveling speed, the information included in an image provided from a mobile terminal.
- According to the mobile terminal and the image display device disclosed herein, there is an advantage that the user can safely and conveniently check, through the image display device while driving, the information included in the image provided from the mobile terminal.
- FIG. 1 is a diagram schematically illustrating an example of an entire image transmission system including a mobile terminal and an image display device according to an exemplary embodiment disclosed herein.
- FIG. 2 is a block diagram illustrating a configuration of a mobile terminal 100 according to embodiments disclosed herein.
- FIG. 3 is a block diagram showing the configuration of an image display device 200 for explaining the embodiments disclosed herein.
- FIG. 4 is a flowchart illustrating an operation control process of an image display device according to an exemplary embodiment disclosed herein.
- FIGS. 5A and 5B are exemplary views illustrating an operation control process of the image display device according to the first embodiment disclosed herein.
- FIG. 6 is a detailed flowchart of the text acquiring process S120 illustrated in FIG. 4.
- FIGS. 7A and 7B are exemplary views illustrating an operation control process of the image display device according to the second embodiment disclosed herein.
- FIG. 8 is an exemplary diagram illustrating a process of selecting a region to be converted into speech by extracting text from a screen image according to the third embodiment of the present disclosure.
- FIG. 9A is an exemplary diagram illustrating a process of selecting a region to be converted into speech by extracting text from a screen image according to the fourth embodiment disclosed herein.
- FIG. 9B is an exemplary diagram illustrating a process of selecting a region to be converted into a voice by extracting text from a screen image according to the fourth embodiment disclosed herein.
- 10A and 10B are exemplary views illustrating a process of converting text extracted from a screen image according to a fifth embodiment of the present disclosure into voice and outputting the same.
- FIGS. 11A and 11B are diagrams illustrating an object control process in the course of converting text extracted from a screen image according to the fifth embodiment of the present disclosure into voice.
- FIG. 12 is a flowchart illustrating an operation control process of a mobile terminal according to a sixth embodiment disclosed herein.
- The suffixes "module" and "unit" for components used in the following description are given merely for ease of preparing the present specification, and "module" and "unit" may be used interchangeably.
- FIG. 1 is a diagram schematically illustrating an example of an entire image transmission system including a mobile terminal and an image display device according to an exemplary embodiment disclosed herein.
- an image transmission system includes a mobile terminal 100 and an image display device 200.
- the mobile terminal 100 may be connected to the image display device 200 by wire or wirelessly to transmit at least one of a screen image and a voice to the image display device 200.
- the image display device 200 may be fixedly mounted in a vehicle and connected to the mobile terminal 100 by wire or wirelessly to receive at least one of a screen image and a voice from the mobile terminal 100. In addition, the image display device 200 may output at least one of the screen image and the voice received from the mobile terminal 100.
- the image display device 200 may receive an input from a user and transmit the received input to the mobile terminal 100. For example, when a user applies a touch input through a touch screen provided in the image display apparatus 200, the image display apparatus 200 recognizes the position of the point in the screen image where the touch input was applied, and transmits information about the recognized position to the mobile terminal 100.
- the mobile terminal 100 may determine that a touch event has occurred at a point where a touch input is applied, and perform an operation corresponding to the touch event. That is, the user may control the operation of the mobile terminal 100 using a touch screen, a hard key, or the like provided in the image display apparatus 200.
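The forwarding of a touch input from the display device to the mobile terminal described above can be sketched as follows. This is a minimal illustration, not the actual protocol of the disclosure; the class names, the `inject_touch_event` method, and the coordinate-scaling step are all hypothetical assumptions.

```python
# Hypothetical sketch: the in-vehicle display recognizes a touch position in
# the mirrored screen image and forwards it to the mobile terminal, which
# treats it as a locally generated touch event.

class MobileTerminal:
    def __init__(self):
        self.last_event = None

    def inject_touch_event(self, x, y):
        # Treat the forwarded position as if a touch occurred on this screen.
        self.last_event = ("touch", x, y)

class DisplayDevice:
    def __init__(self, terminal, scale):
        self.terminal = terminal
        self.scale = scale  # display resolution / terminal resolution

    def on_touch(self, x, y):
        # Translate display coordinates into the terminal's coordinate space.
        self.terminal.inject_touch_event(x / self.scale, y / self.scale)

terminal = MobileTerminal()
display = DisplayDevice(terminal, scale=2.0)  # display is 2x the terminal size
display.on_touch(640, 480)
print(terminal.last_event)  # ('touch', 320.0, 240.0)
```

The scaling step reflects the fact that the vehicle display is typically larger than the terminal screen, so reported coordinates must be mapped back before the terminal can locate the touched menu.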
- For example, a user runs a road guidance application (or a dialing, phonebook, e-mail, or video playback application, etc.) installed in the mobile terminal 100, and the mobile terminal 100 transmits an execution image of the road guidance application to the image display apparatus 200 so that the execution image of the road guidance application is displayed on the image display apparatus 200.
- the user may view the execution image of the road guidance application on the large screen of the image display device 200 in place of the small screen of the mobile terminal 100.
- the user may hear the road guidance voice through the speaker provided in the vehicle in place of the speaker of the mobile terminal 100.
- when the user selects a specific menu through the image display apparatus 200, the mobile terminal 100 may perform an operation on the corresponding menu.
- the mobile terminal 100 may transmit and output a result of performing an operation on the corresponding menu to the image display apparatus 200.
- the mobile terminal 100 and the image display device 200 may be connected using a short-range communication standard such as Bluetooth, a wireless Internet standard such as Wi-Fi, or an external device interface standard such as Universal Serial Bus (USB).
- In addition, a server application for providing a service at the request of a client may be installed in the mobile terminal 100, and a client application for accessing the service provided by the server may be installed in the image display device 200.
- the server application of the mobile terminal 100 captures the screen of the mobile terminal 100 regardless of the application type of the mobile terminal and transmits the captured screen to the client application of the image display device 200.
- the server application controls the operation of the mobile terminal 100 based on the information about the event generated in the video display device 200 from the client application.
- the video display device 200 may remotely control the mobile terminal 100 in a Virtual Network Computing (VNC) manner using a Remote Frame Buffer (RFB) protocol that provides remote access to graphical user interfaces. The VNC allows the mobile terminal 100 to transmit screen updates to the video display device 200 through the network, and transmits an input event generated in the video display device 200 to the mobile terminal 100.
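For reference, the input-event direction of this RFB exchange uses compact binary messages. A PointerEvent, for instance, carries a message type, a button mask, and big-endian 16-bit coordinates. The following sketch encodes one such message based on the published RFB message layout:

```python
import struct

# RFB PointerEvent (message type 5): one byte for the button mask followed by
# big-endian 16-bit x and y positions, per the published RFB message layout.
def encode_pointer_event(button_mask: int, x: int, y: int) -> bytes:
    return struct.pack(">BBHH", 5, button_mask, x, y)

msg = encode_pointer_event(button_mask=1, x=320, y=240)  # left button pressed
print(msg.hex())  # 0501014000f0
```

A client such as the video display device would send this 6-byte message whenever a touch is translated into a pointer event for the mobile terminal's VNC server.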
- the mobile terminal 100 may transmit an audio signal to the video display device 200, a headset, or a hands-free device according to, for example, the Advanced Audio Distribution Profile (A2DP), which defines the sound quality of audio (stereo or mono) that can be streamed from a first device to a second device via a Bluetooth connection, the Headset Profile (HSP) applied in particular to Bluetooth headsets, or the Hands-Free Profile (HFP) applied to vehicle hands-free kits.
- the mobile terminal 100 and the image display device 200 may exchange additional information based on separate protocols.
- the image display device 200 may provide vehicle state information such as vehicle driving information, speed information, fuel information, and the like to the mobile terminal 100.
- Some applications installed in the mobile terminal 100 may use vehicle state information received from the image display device 200 using a separate protocol.
- such applications may provide the image display device 200 with information about the application, such as the type of application (e.g., directions, multimedia, games, etc.), the type of graphical user interface (GUI) (e.g., map, video, menu, etc.), and the state of the application (e.g., running in the foreground or background).
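The additional-information exchange described above could be carried over any structured message format. The sketch below uses JSON purely for illustration; the field names and message shapes are assumptions, not the actual separate protocol of the disclosure.

```python
import json

# Hypothetical sketch of the additional-information exchange: the display
# reports vehicle state, the terminal reports application state. The field
# names are illustrative assumptions, not the actual protocol.
def vehicle_state_message(speed_kmh: float, fuel_percent: float) -> str:
    return json.dumps({"type": "vehicle_state",
                       "speed_kmh": speed_kmh,
                       "fuel_percent": fuel_percent})

def app_info_message(app_type: str, gui_type: str, foreground: bool) -> str:
    return json.dumps({"type": "app_info", "app_type": app_type,
                       "gui_type": gui_type, "foreground": foreground})

print(vehicle_state_message(62.5, 48.0))
print(app_info_message("directions", "map", True))
```

An application on the terminal could, for example, use the reported speed to adapt its GUI while the vehicle is moving.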
- the mobile terminal (mobile phone) 100 may be implemented in various forms.
- the mobile terminal 100 includes a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), and the like.
- the mobile terminal 100 includes a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. Not all components of the mobile terminal 100 illustrated in FIG. 2 are essential, and the mobile terminal 100 may be implemented with more or fewer components than those illustrated in FIG. 2.
- the wireless communication unit 110 may include one or more components for performing wireless communication between the mobile terminal 100 and the wireless communication system or wireless communication between the mobile terminal 100 and the network in which the mobile terminal 100 is located.
- the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, a location information module 115, and the like.
- the broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
- the broadcast channel may include a satellite channel and a terrestrial channel.
- the broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a pre-generated broadcast signal and / or broadcast related information and transmits the same to the mobile terminal 100.
- the broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
- the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
- the broadcast related information may be provided through a mobile communication network, and in this case, may be received by the mobile communication module 112.
- the broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
- the broadcast receiving module 111 receives broadcast signals using various broadcasting systems; in particular, it can receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), Integrated Services Digital Broadcast-Terrestrial (ISDB-T), and the like. Of course, the broadcast receiving module 111 may be configured to be suitable not only for the digital broadcast systems described above but for any broadcast system providing broadcast signals. The broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.
- the mobile communication module 112 transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
- the wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, and/or a text/multimedia message.
- the wireless internet module 113 refers to a module for wireless internet access, and the wireless internet module 113 may be embedded or external to the mobile terminal 100.
- Wireless Internet technologies such as Wireless LAN (WLAN), Wi-Fi, Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like may be used.
- the short range communication module 114 means a module for short range communication.
- As short-range communication technologies, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Wireless LAN (e.g., the 802.11n protocol), and the like may be used.
- the position information module 115 is a module for checking or obtaining the position of the mobile terminal (and the position of the vehicle when the mobile terminal is mounted in the vehicle). A typical example thereof is a Global Positioning System (GPS) module.
- the GPS module receives location information from a plurality of satellites.
- the location information may include coordinate information represented by latitude and longitude.
- the GPS module can measure the exact time and distance from three or more satellites and accurately calculate the current position by triangulating three different distances.
- a method of obtaining distance and time information from three satellites and correcting the error with one satellite may be used.
- the GPS module can obtain not only the location of latitude, longitude, and altitude but also accurate time together with three-dimensional speed information from the location information received from the satellite.
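The distance-based positioning described above can be illustrated in two dimensions: with three known transmitter positions and measured distances, the receiver lies at the intersection of three circles. The sketch below is a simplified planar illustration for intuition only; actual GPS positioning works in three dimensions and additionally solves for receiver clock error (which is why the fourth satellite mentioned above is used for correction).

```python
# 2D trilateration sketch: given three known anchor positions and measured
# distances, solve the linearized circle-intersection equations for (x, y).
# Simplified illustration only; real GPS solves in 3D with a clock-error term.
def trilaterate(p1, r1, p2, r2, p3, r3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a = 2 * (x2 - x1); b = 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2); e = 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (c * e - f * b) / (e * a - b * d)
    y = (c * d - a * f) / (b * d - a * e)
    return x, y

# A receiver at (1, 1) measured from anchors at (0, 0), (4, 0), and (0, 4):
pos = trilaterate((0, 0), 2**0.5, (4, 0), 10**0.5, (0, 4), 10**0.5)
print(pos)  # approximately (1.0, 1.0)
```

Subtracting pairs of circle equations cancels the quadratic terms, leaving a 2x2 linear system, which is why three distances suffice for a planar fix.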
- a Wi-Fi Positioning System and / or a Hybrid Positioning System may be applied as the location information module 115.
- the A/V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122.
- the camera 121 processes an image frame such as a still image or a video obtained by an image sensor in a video call mode or a photographing mode.
- the processed image frame may be displayed on the display unit 151.
- the image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be configured according to the configuration of the mobile terminal.
- the microphone 122 receives an external sound signal by a microphone in a call mode, a recording mode, a voice recognition mode, etc., and processes the external sound signal into electrical voice data.
- the processed voice data may be converted into a form transmittable to the mobile communication base station through the mobile communication module 112 and output.
- the microphone 122 may implement various noise removing algorithms for removing noise generated in the process of receiving an external sound signal.
- the user input unit 130 generates input data for the user to control the operation of the mobile terminal.
- the user input unit 130 may include a key pad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.
- the sensing unit 140 senses the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the location of the mobile terminal 100, the presence or absence of user contact, the orientation of the mobile terminal 100, and the acceleration/deceleration of the mobile terminal 100, and generates a sensing signal for controlling the operation of the mobile terminal 100.
- the sensing unit 140 is responsible for sensing functions related to whether the power supply unit 190 is supplied with power, whether the interface unit 170 is coupled to an external device, and the like.
- the interface unit 170 serves as an interface with all external devices connected to the mobile terminal 100.
- the interface unit 170 may include a wired / wireless headset port, an external charger port, a wired / wireless data port, a memory card port, a port for connecting a device equipped with an identification module, An audio input / output (I / O) port, a video input / output (I / O) port, and an earphone port may be configured.
- the identification module is a chip that stores various information for authenticating the use authority of the mobile terminal 100, and includes a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
- the device equipped with the identification module may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the mobile terminal 100 through a port.
- the interface unit 170 may receive data from an external device or receive power and transmit the data to each component inside the mobile terminal 100 or transmit data within the mobile terminal 100 to an external device.
- the output unit 150 is for outputting an audio signal, a video signal, or an alarm signal.
- the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
- the display unit 151 displays and outputs information processed by the mobile terminal 100. For example, when the mobile terminal 100 is in a call mode, the mobile terminal 100 displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a photographing mode, the mobile terminal 100 displays a photographed and / or received image, a UI, or a GUI.
- the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display. In addition, two or more display units 151 may exist according to the implementation form of the mobile terminal 100. For example, an external display unit (not shown) and an internal display unit (not shown) may be simultaneously provided in the mobile terminal 100.
- when the display unit 151 and a sensor for detecting a touch operation (hereinafter referred to as a touch sensor) form a mutual layer structure (hereinafter referred to as a touch screen), the display unit 151 may be used as an input device in addition to an output device.
- the touch sensor may have, for example, a form of a touch film, a touch sheet, a touch pad, or the like.
- the touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated at a specific portion of the display unit 151 into an electrical input signal.
- the touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.
- when a touch input is applied to the touch sensor, the corresponding signal(s) are sent to a touch controller (not shown).
- the touch controller processes the signal (s) and then transmits the corresponding data to the controller 180. As a result, the controller 180 can determine which area of the display unit 151 is touched.
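The controller's determination of which display area was touched amounts to a hit test over on-screen regions. A minimal sketch is given below; the region layout and names are hypothetical examples, not part of the disclosure.

```python
# Sketch of how a controller might determine which area of the display unit
# was touched, given coordinates reported by the touch controller.
# The region layout is a hypothetical example.
def hit_test(regions, x, y):
    """Return the name of the first region containing (x, y), or None."""
    for name, (left, top, right, bottom) in regions.items():
        if left <= x < right and top <= y < bottom:
            return name
    return None

regions = {"menu": (0, 0, 100, 480), "map": (100, 0, 800, 480)}
print(hit_test(regions, 50, 200))   # menu
print(hit_test(regions, 400, 200))  # map
```

Once the touched region is known, the controller can dispatch the operation mapped to that area, such as executing a menu item.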
- the proximity sensor 141 may be disposed in the inner region of the mobile terminal 100 surrounded by the touch screen or near the touch screen.
- the proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using mechanical force by using an electromagnetic force or infrared rays.
- the proximity sensor 141 has a longer life and higher utilization than a contact sensor.
- Examples of the proximity sensor 141 include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
- when the touch screen is capacitive, it is configured to detect the proximity of the pointer by a change in the electric field according to the proximity of the pointer.
- the touch screen may be classified as a proximity sensor.
- the act of causing the pointer to be recognized as being located on the touch screen without contacting the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch".
- the position of a proximity touch of the pointer on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the pointer makes the proximity touch.
- the proximity sensor 141 detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, direction, speed, time, position, and movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.
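Deriving a proximity-touch pattern from sensed pointer samples can be sketched as follows: the distance to the screen over time yields the approach speed, and whether the pointer reaches the surface distinguishes a proximity touch from a contact touch. The sample format and thresholds are illustrative assumptions, not the disclosed sensing method.

```python
# Sketch of deriving a proximity-touch pattern from sensed pointer samples:
# distance to the screen over time yields approach speed and touch type.
# The sample format and the zero-distance criterion are illustrative.
def classify(samples):
    """samples: list of (time_s, distance_mm). Returns (kind, speed_mm_s)."""
    (t0, d0), (t1, d1) = samples[0], samples[-1]
    speed = (d0 - d1) / (t1 - t0)  # positive = pointer approaching the screen
    kind = "contact_touch" if d1 <= 0 else "proximity_touch"
    return kind, speed

print(classify([(0.0, 20.0), (0.1, 10.0)]))  # ('proximity_touch', 100.0)
print(classify([(0.0, 20.0), (0.2, 0.0)]))   # ('contact_touch', 100.0)
```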
- the sound output module 152 outputs audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like. In addition, the sound output module 152 outputs a sound signal related to a function (eg, a call signal reception sound, a message reception sound, etc.) performed in the mobile terminal 100.
- the sound output module 152 may include a speaker, a buzzer, and the like.
- the alarm unit 153 outputs a signal for notifying occurrence of an event of the mobile terminal 100.
- Examples of events occurring in the mobile terminal include call signal reception, message reception, and key signal input.
- the alarm unit 153 may output a signal for notifying occurrence of an event in a form other than an audio signal or a video signal.
- the signal may be output in the form of vibration.
- the alarm unit 153 may vibrate the mobile terminal through a vibration means.
- when a key signal is input, the alarm unit 153 may vibrate the mobile terminal 100 through the vibration means as feedback to the key signal input.
- the user can recognize the occurrence of the event through the vibration as described above.
- the signal for notification of event occurrence may be output through the display unit 151 or the voice output module 152.
- the haptic module 154 generates various haptic effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154.
- the intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.
- in addition to vibration, the haptic module 154 can generate various tactile effects, such as a pin arrangement moving vertically with respect to the contacted skin surface, a jet or suction force of air through a jet or suction opening, grazing of the skin surface, contact of an electrode, electrostatic force, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.
- the haptic module 154 may not only deliver the haptic effect through direct contact, but may also be implemented to allow the user to feel the haptic effect through a muscle sense such as a finger or an arm.
- the haptic module 154 may be provided with two or more according to the configuration aspect of the telematics terminal.
- the haptic module 154 may be provided where the contact with the user is frequent in the vehicle. For example, it may be provided in a steering wheel, a shift gear lever, a seat seat, and the like.
- the memory 160 may store a program for the processing and control of the controller 180, and may also perform a function for temporarily storing input/output data (for example, map data, phonebook, messages, still images, video, etc.).
- the memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), and the like.
- the mobile terminal 100 may operate a web storage that performs the storage function of the memory 160 on the Internet.
- the interface unit 170 may serve as a passage through which power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or as a passage through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. Various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.
- the controller 180 typically controls the overall operation of the mobile terminal 100.
- the controller 180 performs related control and processing for voice call, data communication, video call, and the like.
- the controller 180 may include a multimedia module 181 for multimedia playback.
- the multimedia module 181 may be implemented in the controller 180 or may be implemented separately from the controller 180.
- the controller 180 may perform a pattern recognition process for recognizing a writing input or a drawing input performed on a touch screen as text and an image, respectively.
- the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.
- the functions of the components applied to the mobile terminal 100 may be implemented in a computer-readable recording medium using software, hardware or a combination thereof.
- Hardware implementations may use at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing functions. In some cases, such embodiments are implemented by the controller 180.
- embodiments such as procedures or functions may be implemented with separate software modules, each of which performs at least one function or operation. Software code may be implemented by a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.
- the voice recognition module 182 recognizes the voice spoken by the user and performs a corresponding function according to the recognized voice signal.
- the navigation session 300 applied to the mobile terminal 100 displays a driving route on map data.
- FIG. 3 is a block diagram showing the configuration of an image display device 200 for explaining the embodiments disclosed herein.
- the image display device 200 includes a main board 210 including a controller (e.g., a central processing unit (CPU)) 212 for controlling the overall operation of the image display device 200, a memory 213 for storing programs for the processing and control of the controller 212 and input/output data, a key controller 211 for controlling various key signals, and an LCD controller 214 for controlling a liquid crystal display (LCD).
- the memory 213 may store map information (map data) for displaying road guidance information on a digital map.
- the memory 213 may store a traffic information collection control algorithm for inputting traffic information according to a road situation in which the vehicle is currently driving and information for controlling the algorithm.
- the main board 210 includes a code division multiple access (CDMA) module 206 assigned a unique device number and embedded in the vehicle, a GPS module 207 for receiving Global Positioning System (GPS) signals for guiding the location of the vehicle and tracking a driving route from a starting point to a destination, and for transmitting traffic information collected by the user as a GPS signal, a CD deck 208 for reproducing a signal recorded on a compact disk (CD), a gyro sensor 209, and the like.
- the CDMA module 206 and the GPS module 207 may transmit and receive signals through the antennas 204 and 205.
- the broadcast receiving module 222 may be connected to the main board 210 and receive a broadcast signal through the antenna 223.
- the display unit 201 under the control of the LCD controller 214, the front board 202 under the control of the key controller 211, and a camera 227 for capturing the inside and/or outside of the vehicle may be connected to the main board 210 through the interface board 203.
- the display unit 201 displays various video signals and text signals, and the front board 202 is provided with buttons for inputting various key signals and provides a key signal corresponding to the button selected by the user to the main board 210.
- the display unit 201 includes the proximity sensor and touch sensor (touch screen) of FIG. 2.
- the front board 202 may include a menu key for directly inputting traffic information, and the menu key may be configured to be controlled by the key controller 211.
- the audio board 217 is connected to the main board 210 and processes various audio signals.
- the audio board 217 includes a microcomputer 219 for controlling the audio board 217, a tuner 218 for receiving a radio signal, a power supply unit 216 for supplying power to the microcomputer 219, and various voices. It consists of a signal processor 215 for processing a signal.
- a radio antenna 220 for receiving radio signals and a tape deck 221 for reproducing an audio tape are also connected to the audio board 217.
- the audio board 217 may further include a voice output unit (eg, an amplifier) 226 for outputting a voice signal processed by the audio board 217.
- the voice output unit (amplifier) 226 is connected to the vehicle interface 224. That is, the audio board 217 and the main board 210 are connected to the vehicle interface 224.
- the vehicle interface 224 may be connected to a hands-free 225a for inputting a voice signal, an airbag 225b for occupant safety, a speed sensor 225c for detecting the speed of the vehicle, and the like.
- the speed sensor 225c calculates a vehicle speed and provides the calculated vehicle speed information to the central processing unit 212.
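The embodiments in this disclosure use the vehicle speed provided here to decide how information from the mobile terminal is presented to the driver. A minimal sketch of such a speed gate follows; the 5 km/h threshold and the function name are illustrative assumptions, not values taken from the claims.

```python
# Sketch of a speed gate: when the vehicle speed provided by the speed
# sensor exceeds a threshold, present text via text-to-speech instead of
# relying on on-screen reading. The 5 km/h threshold is an assumption.
def presentation_mode(speed_kmh: float, threshold_kmh: float = 5.0) -> str:
    return "speech" if speed_kmh > threshold_kmh else "display"

print(presentation_mode(0.0))   # display
print(presentation_mode(60.0))  # speech
```

Gating on the sensed speed lets the device keep full visual detail when the vehicle is stopped and switch to an eyes-free presentation while driving.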
- the navigation session 300 applied to the image display device 200 generates road guide information based on map data and vehicle current location information, and notifies the user of the generated road guide information.
- the display unit 201 detects a proximity touch in the display window through a proximity sensor. For example, the display unit 201 detects the position of a proximity touch when a pointer (for example, a finger or a stylus pen) makes the proximity touch, and outputs position information corresponding to the detected position to the controller 212.
- the voice recognition device (or voice recognition module) 301 recognizes the voice spoken by the user and performs a corresponding function according to the recognized voice signal.
- the navigation session 300 applied to the image display device 200 displays a driving route on map data, and when the location of the mobile terminal 100 is within a preset distance from a blind spot included in the driving route, it may automatically form a wireless network with a terminal (e.g., a vehicle navigation device) mounted in a nearby vehicle and/or a mobile terminal carried by a nearby pedestrian through wireless communication (e.g., a short-range wireless communication network).
- the main board 210 may be connected to the interface unit 230, and the interface unit 230 may include an external device interface unit 231 and a network interface unit 232.
- the external device interface unit 231 may connect the external device to the image display device 200.
- the external device interface unit 231 may include an A / V input / output unit (not shown) or a wireless communication unit (not shown).
- the external device interface unit 231 may be connected to an external device such as a digital versatile disk (DVD), a Blu-ray, a game device, a camera, a camcorder, a computer (laptop), or the like by wire or wireless.
- the external device interface unit 231 transmits an image, audio, or data signal input from a connected external device to the controller 212 of the image display device 200.
- the image, audio or data signal processed by the controller 212 may be output to the connected external device.
- the A/V input/output unit may include a USB terminal, a CVBS (Composite Video Banking Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like, so that video and audio signals from an external device can be input to the image display device 200.
- the wireless communication unit may perform near field communication with another electronic device.
- the image display device 200 can be networked with other electronic devices according to communication standards such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Digital Living Network Alliance (DLNA).
- the external device interface unit 231 may be connected to a set-top box through at least one of the various terminals described above to perform input/output operations with the set-top box.
- the external device interface unit 231 may receive an application or a list of applications from a neighboring external device and transmit the received application or application list to the controller 212 or the memory 213.
- the network interface unit 232 provides an interface for connecting the video display device 200 to a wired / wireless network including an internet network.
- the network interface unit 232 may include, for example, an Ethernet terminal for connection with a wired network; for connection with a wireless network, communication standards such as WLAN (Wireless LAN, Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access) can be used.
- the network interface unit 232 may transmit or receive data with another user or another electronic device through a connected network or another network linked to the connected network.
- some of the content data stored in the image display device 200 may be transmitted to a user or electronic device selected from among other users or other electronic devices registered in advance in the image display device 200.
- the network interface unit 232 can access a predetermined web page through a connected network or another network linked to the connected network. That is, by accessing a predetermined web page through the network, it is possible to send or receive data with the server.
- content or data provided by a content provider or a network operator may be received. That is, content such as a movie, an advertisement, a game, a VOD, a broadcast signal, and related information provided from a content provider or a network provider may be received through a network.
- the network interface unit 232 may select and receive a desired application from among applications that are open to the public through the network.
- FIG. 4 is a flowchart illustrating an operation control process of an image display device according to an exemplary embodiment disclosed herein.
- the interface unit 230 may receive an image from the mobile terminal 100 connected through the external device interface unit 231 or the network interface unit 232 (S110). For example, the interface unit 230 receives a frame buffer including values for all pixels to be displayed on the screen of the mobile terminal 100 from the mobile terminal 100.
- the image may be a screen image corresponding to the screen of the mobile terminal 100.
- the screen image may be a standby screen of the mobile terminal 100, a lock screen, or an image corresponding to an application running in the mobile terminal 100.
- the display unit 201 may display an image received from the mobile terminal 100 under the control of the LCD controller 214.
- the controller 212 may resize the image received from the mobile terminal 100 and control the LCD controller 214 to display the resized image on the display 201.
- the controller 212 may acquire text corresponding to the image received from the mobile terminal 100 (S120).
- the controller 212 may extract optical text from an image received from the mobile terminal 100 by performing optical character recognition.
- the controller 212 may request the mobile terminal 100 for text corresponding to the image received from the mobile terminal 100 and receive the requested text from the mobile terminal 100.
- the mobile terminal 100 may perform optical character recognition to extract text corresponding to the image requested by the image display device 200.
- the mobile terminal 100 may determine whether the application corresponding to the requested image is a text-based application such as a web browser or e-book, and if the application is a text-based application, may obtain text through the corresponding application.
- the controller 180 may obtain text from data stored in an area accessible by an application in the memory 160.
- the controller 212 may acquire the speed of the vehicle (S130).
- the speed sensor 225c calculates the speed of the vehicle, and provides the calculated vehicle speed information to the controller 212, so that the controller 212 may obtain the speed of the vehicle.
- the GPS module 207 acquires the location information of the vehicle, and the controller 212 may directly calculate the speed of the vehicle from the change in the location of the vehicle over time based on the acquired location information of the vehicle.
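The speed computation from successive GPS fixes described above can be sketched as follows. This is a minimal illustration under assumed data shapes; the function names are hypothetical, not from the patent:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_mps(fix_a, fix_b):
    """Speed in m/s from two (lat, lon, timestamp_s) fixes: distance over elapsed time."""
    (lat1, lon1, t1), (lat2, lon2, t2) = fix_a, fix_b
    dt = t2 - t1
    if dt <= 0:
        raise ValueError("fixes must be time-ordered")
    return haversine_m(lat1, lon1, lat2, lon2) / dt
```

For example, two fixes about 0.001° of latitude apart (roughly 111 m) taken 10 s apart yield approximately 11.1 m/s.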
- the controller 212 compares the speed of the vehicle with a threshold speed, and determines whether to enter text-to-speech (TTS) mode based on the comparison result (S140). That is, when the speed of the vehicle exceeds the threshold speed stored in the memory 213 (for example, 0 mile/h, at which the vehicle starts to move, or a driving regulation speed such as 5 mile/h), the controller 212 enters TTS mode. If the speed of the vehicle does not exceed the threshold speed stored in the memory 213, the controller 212 maintains the normal mode.
- the controller 212 converts the obtained text into voice (S150).
- the controller 212 may convert text into speech using a text to speech (TTS) engine.
- the voice output unit 226 outputs the converted voice (S160).
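The mode decision in steps S130–S160 can be sketched as below. The helper names and the default threshold value are assumptions for illustration, not values fixed by the patent:

```python
def select_mode(vehicle_speed_mph, threshold_mph=5.0):
    """S140: enter TTS mode only when the vehicle speed exceeds the stored threshold."""
    return "TTS" if vehicle_speed_mph > threshold_mph else "NORMAL"

def handle_text(text, vehicle_speed_mph, tts_engine, threshold_mph=5.0):
    """S150-S160: in TTS mode, convert the acquired text to speech for voice output."""
    if select_mode(vehicle_speed_mph, threshold_mph) == "TTS":
        return tts_engine(text)  # hand off to a TTS engine
    return None  # normal mode: the image is simply displayed
```

A stub such as `lambda t: "audio:" + t` can stand in for the TTS engine when exercising the flow.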
- FIGS. 5A and 5B are exemplary views illustrating an operation control process of the image display device according to the first embodiment disclosed herein.
- the interface unit 230 may receive a screen image from the mobile terminal 100.
- the interface unit 230 receives a frame buffer corresponding to the screen image from the mobile terminal 100, and the display unit 201 may display each pixel at its corresponding position on the screen according to the received frame buffer.
- the mobile terminal 100 may or may not display the screen image 400 on the screen of the mobile terminal 100 according to a setting.
- the display unit 201 displays the screen image 500 of the mobile terminal 100 received from the mobile terminal 100 on the screen, and the controller 212 may extract text by performing optical character recognition on the screen image 500 displayed on the screen.
- the controller 212 may determine the speed of the vehicle and compare it with the threshold speed. When the speed of the vehicle exceeds the threshold speed, the controller 212 may enter the TTS mode. In this case, the display unit 201 may display the indicator 502 indicating that the image display device 200 is in the TTS mode.
- the controller 212 converts the extracted text into a voice using a text to speech (TTS) engine and outputs the converted voice through the voice output unit 226.
- the image display device 200 may provide an interface for receiving an input for selecting an area to extract text from the screen image 500 displayed on the display unit 201 and converting the text into voice.
- the user may drag a region on the screen of the display unit 201, and the display unit 201 may detect the dragged region using a touch sensor.
- the controller 212 may select the detected region 510 and extract text by performing optical character recognition on the selected region in the screen image.
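The drag-to-select flow above, where OCR runs only on the selected region rather than the whole screen image, might look like the following sketch. `run_ocr` stands in for a real OCR engine and the frame-buffer shape (rows of pixels) is an assumption:

```python
def crop(frame_buffer, region):
    """Crop a region (x, y, w, h) out of a frame buffer given as rows of pixels."""
    x, y, w, h = region
    return [row[x:x + w] for row in frame_buffer[y:y + h]]

def text_from_region(frame_buffer, region, run_ocr):
    """Run OCR on the user-selected region only, not the whole screen image."""
    return run_ocr(crop(frame_buffer, region))
```

In practice the dragged region would come from the touch sensor and `run_ocr` from an OCR library; here a trivial stub suffices to exercise the cropping logic.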
- FIG. 6 is a detailed flowchart of the text acquiring process S120 illustrated in FIG. 4.
- the controller 212 may request a text corresponding to the image received from the mobile terminal 100 from the mobile terminal 100 (S121).
- the controller 180 of the mobile terminal 100 may identify the type of the application corresponding to the image according to the request (S122). If the application is a text-based application, the controller 180 may obtain text corresponding to the image from the memory 160 through the application (S124); otherwise, the controller 180 may acquire optical text by performing optical character recognition on the image (S125).
- the wireless communication unit 110 may transmit the obtained text to the image display device 200 (S126).
- FIGS. 7A and 7B are exemplary views illustrating an operation control process of the image display device according to the second embodiment disclosed herein.
- the interface unit 230 may receive a screen image from the mobile terminal 100.
- the interface unit 230 receives a frame buffer corresponding to the screen image from the mobile terminal 100, and the display unit 201 displays each pixel at a corresponding position on the screen according to the received frame buffer.
- the mobile terminal 100 may or may not display the screen image 400 on the screen of the mobile terminal 100 according to a setting.
- the interface unit 230 may request text corresponding to the screen image from the mobile terminal 100 and receive the requested text from the mobile terminal 100.
- the controller 212 may determine the speed of the vehicle and compare it with the threshold speed. When the speed of the vehicle exceeds the threshold speed, the controller 212 may enter the TTS mode. In this case, the display unit 201 may display the indicator 502 indicating that the image display device 200 is in the TTS mode.
- the controller 212 may convert the received text into voice using a text-to-speech (TTS) engine and output the converted voice through the voice output unit 226.
- the image display device 200 may provide an interface for receiving an input for selecting an area to extract text from the screen image 500 displayed on the screen and converting the text into voice.
- the user may drag a region on the screen of the display unit 201, and the display unit 201 may detect the dragged region using a touch sensor.
- the interface unit 230 requests text corresponding to the detected area from the mobile terminal 100 (in this case, the interface unit 230 may transmit information about the detected area to the mobile terminal 100), and may receive the requested text from the mobile terminal 100.
- FIG. 8 is an exemplary diagram illustrating a process of selecting a region to be converted into speech by extracting text from a screen image according to the third embodiment of the present disclosure.
- the display unit 201 may display the image 600 received by the interface unit 230 from the mobile terminal 100.
- the user may drag a region in the region where the image of the display unit 201 is displayed.
- the display unit 201 may detect the dragged area using the touch sensor.
- the controller 212 may acquire text in the detected area 610, convert the obtained text into voice, and output the converted voice.
- FIG. 9A is an exemplary diagram illustrating a process of selecting a region to be converted into speech by extracting text from a screen image according to the fourth embodiment disclosed herein.
- the display unit 201 may display the image 600 received by the interface unit 230 from the mobile terminal 100.
- the controller 212 may analyze the image 600 displayed on the display unit 201 to extract a plurality of blocks. For example, the controller 212 may extract blocks having similar patterns from the displayed image by using an image pattern recognition algorithm. Each of the blocks may include at least one text.
- the controller 212 may extract three blocks 622, 624, and 626 from the image 600 displayed on the display unit 201. The priority of each block may then be determined based on its position on the display unit 201. For example, block 622 is displayed at the top of the three blocks 622, 624, and 626, so it can be given the highest priority. And since block 624 is displayed to the left of block 626, block 624 may be given the next highest priority. As a result, first rank is given to block 622, second rank to block 624, and third rank to block 626.
- the controller 212 may extract text from blocks according to priority, and convert the extracted text into voice and output the same. For example, text is first extracted at block 622, and the extracted text is converted to speech and output. Next, the text is extracted at block 624, and the extracted text is converted to speech and output. Finally, text is extracted at block 626, and the extracted text is converted to speech and output.
- when an input for selecting the next block is received from the user (for example, saying "next" by voice, or pressing a next key provided on the front board 202), the controller 212 stops converting the text extracted from block 622 into voice and outputting it.
- in the next block 624, text is extracted, and the extracted text is converted into voice and output.
- likewise, when an input for selecting the previous block is received from the user (for example, saying "previous" by voice, or pressing a previous key provided on the front board 202), the controller 212 stops converting the text extracted from block 624 into voice and outputting it.
- in the previous block 622, text is extracted, and the extracted text is converted into voice and output.
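The top-to-bottom, left-to-right priority assignment described for blocks 622, 624, and 626 amounts to a sort on block position. A minimal sketch, with hypothetical block records keyed by `top`, `left`, and `text`:

```python
def prioritize_blocks(blocks):
    """Order text blocks for speech: topmost first, then leftmost among equals."""
    return sorted(blocks, key=lambda b: (b["top"], b["left"]))

def speak_in_order(blocks, tts):
    """Convert each block's text to voice in priority order."""
    return [tts(b["text"]) for b in prioritize_blocks(blocks)]
```

With this ordering, a block at the top of the screen is always spoken before blocks below it, matching the first/second/third ranking in the example.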
- FIG. 9B is an exemplary diagram illustrating a process of selecting a region to be converted into a voice by extracting text from a screen image according to the fourth embodiment disclosed herein.
- the display unit 201 may display the image 600 received by the interface unit 230 from the mobile terminal 100.
- the user may drag a region in the region where the image 600 of the display unit 201 is displayed.
- the display unit 201 may detect the dragged area using the touch sensor.
- the controller 212 designates the detected area 631 as an area to extract text and convert it into voice.
- the controller 212 may then, using an image pattern recognition algorithm, additionally designate an area 632 that is adjacent to the designated area 631 and has a pattern similar to that of the area 631 as an area from which text is to be extracted and converted into voice.
- likewise, using the image pattern recognition algorithm, the controller 212 additionally designates an area 633 that is adjacent to the designated area 632 and has a pattern similar to that of the area 632 as an area from which text is to be extracted and converted into voice.
- in this manner, the areas 632, 633, 634, and 635 may be sequentially extended from the area 631 as areas from which text is to be extracted and converted into voice. When the user inputs a voice output command (for example, by voice, by a voice output key provided on the front board 202, or through a voice output menu provided on the display unit 201), or when no input to expand the area is received within a predetermined time, the controller 212 extracts the text from the areas designated up to that point as areas to be converted into voice, converts the extracted text into voice, and outputs it.
- FIGS. 10A and 10B are exemplary views illustrating a process of converting text extracted from a screen image into voice and outputting it according to the fifth embodiment of the present disclosure.
- the display unit 201 may display the area of the screen image 600 from which the controller 212 extracts text and whose extracted text is converted into voice and output, so that it is distinguished from the other areas.
- the display unit 201 may display the area from which the controller 212 extracts text and whose extracted text is converted into voice and output, by applying an animation effect.
- for example, the display unit 201 may enlarge the area 642 from which text is extracted by the controller 212 and whose extracted text is converted into voice and output, and display the enlarged area 644 so that it is located at the center of the screen.
- alternatively, the display unit 201 may highlight the area 652 from which text is extracted by the controller 212 and whose extracted text is converted into voice and output.
- FIGS. 11A and 11B are diagrams illustrating an object control process while text extracted from a screen image is converted into voice and output according to the fifth embodiment of the present disclosure.
- the display unit 201 may display an image 700 received by the interface unit 230 from the mobile terminal 100.
- the controller 212 may extract text from the displayed image 700, convert the extracted text into voice, and output the converted text.
- an input for selecting a text object in the displayed image 700 may be received from the user (for example, a voice speaking text included in the text object, such as "up and down girls").
- the controller 212 converts a voice received from the user into text using a speech to text (STT) engine or the like.
- the controller 212 searches for the converted text in the text extracted from the displayed image 700.
- the controller 212 generates a control event at a point corresponding to the text object including the found text.
- the text object 712 including the searched text may be displayed to be distinguished from other graphic objects in the image 700 displayed on the display unit 201.
- the interface unit 230 converts the generated control event into a signal and transmits the signal to the mobile terminal 100, and the mobile terminal 100 performs an operation related to the text object based on the signal received from the interface unit 230.
- the interface unit 230 receives the screen image 800 of the mobile terminal 100 according to the execution of the operation, and the display unit 201 displays the received screen image 800.
- the display unit 201 may display an image 700 received by the interface unit 230 from the mobile terminal 100.
- the controller 212 may extract text from the displayed image 700, convert the extracted text into voice, and output the converted text.
- an input for selecting a text object in the displayed image 700 may be received from the user (for example, a voice speaking text included in the text object, such as "Shanghai").
- the controller 212 converts a voice received from the user into text using a speech to text (STT) engine or the like.
- the controller 212 searches for the converted text in the text extracted from the displayed image 700.
- when a plurality of objects include the searched text, the display unit 201 displays the plurality of objects 722, 724, and 726 so that they are distinguished from the other objects in the displayed image 700.
- each of the plurality of objects 722, 724, and 726 may be distinguished from each other.
- the display unit 201 may display the indicators 1, 2, and 3 in association with the objects 722, 724, and 726, respectively, so that each of the plurality of objects 722, 724, and 726 can be distinguished.
- the indicator may be an indicator that can be spoken using numbers or letters.
- the controller 212 may receive an input for selecting one object from among the plurality of objects from the user. For example, the user may say “No. 1” by voice or touch the position corresponding to No. 1 on the display unit 201.
- the controller 212 generates a control event at a point corresponding to the selected text object.
- the interface unit 230 converts the generated control event into a signal and transmits the signal to the mobile terminal 100, and the mobile terminal 100 performs an operation related to the text object based on the signal received from the interface unit 230.
- the interface unit 230 receives the screen image 800 of the mobile terminal 100 according to the execution of the operation, and the display unit 201 displays the received screen image 800.
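The spoken-selection flow above (convert speech to text, search the extracted text objects, and number multiple matches with speakable indicators) can be sketched as follows. The data shapes and helper names are assumptions for illustration:

```python
def find_matches(spoken_text, text_objects):
    """Return all text objects whose text contains the recognized phrase."""
    return [obj for obj in text_objects if spoken_text in obj["text"]]

def assign_indicators(matches):
    """Attach speakable numeric indicators (1, 2, 3, ...) when several objects match."""
    return {str(i): obj for i, obj in enumerate(matches, start=1)}

def select_object(spoken_text, text_objects, choice=None):
    """Resolve a selection: a single match is used directly; otherwise the user picks by number."""
    matches = find_matches(spoken_text, text_objects)
    if len(matches) == 1:
        return matches[0]
    indicators = assign_indicators(matches)
    return indicators.get(choice)
```

Saying "Shanghai" when two objects contain that text would thus surface indicators 1 and 2, and a follow-up "No. 2" resolves the ambiguity.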
- FIG. 12 is a flowchart illustrating an operation control process of a mobile terminal according to a sixth embodiment disclosed herein.
- the controller 180 obtains text corresponding to the screen image (S210).
- the controller 180 determines whether the application corresponding to the screen image is a text-based application such as a web browser or e-book, and if the application is a text-based application, may obtain text from data accessible through the application. If the application is not a text-based application, the controller 180 may extract optical text corresponding to the screen image by performing optical character recognition on the screen image.
- the controller 180 obtains the speed of the vehicle (S220).
- the location information module 115 may obtain GPS information of the mobile terminal, and the controller 180 may directly calculate the speed of the vehicle according to a change in the location of the vehicle over time.
- the wireless communication unit 110 may receive the speed of the vehicle from the image display device 200.
- the speed sensor 225c calculates the speed of the vehicle and provides the calculated vehicle speed information to the controller 212.
- the GPS module 207 may obtain location information of the vehicle, and the controller 212 may directly calculate the speed of the vehicle according to the change in the location of the vehicle over time.
- the controller 212 may provide the obtained vehicle speed to the controller 180 through the interface unit 230.
- the controller 180 compares the vehicle speed with the threshold speed (S230).
- when the vehicle speed exceeds the threshold speed, the wireless communication unit 110 transmits the screen image together with the obtained text to the image display device 200 (S240).
- when the vehicle speed does not exceed the threshold speed, the wireless communication unit 110 transmits only the screen image to the image display device 200 (S250).
- the display unit 201 displays a screen image received from the mobile terminal 100.
- the controller 212 converts the text received from the mobile terminal 100 into speech using a text to speech (TTS) engine.
- the voice output unit 226 outputs the converted voice through a speaker, a Bluetooth headset, or the like.
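The mobile-terminal side of the sixth embodiment (S210 through S250) can be sketched as below; `send` is a stand-in for the wireless communication unit and the helper names are hypothetical:

```python
def acquire_text(app_is_text_based, app_text, ocr):
    """S210: take text from the application when it is text-based, else fall back to OCR."""
    return app_text if app_is_text_based else ocr()

def transmit(screen_image, text, speed, threshold, send):
    """S230-S250: send image plus text above the threshold speed, image only otherwise."""
    if speed > threshold:
        return send(image=screen_image, text=text)   # S240
    return send(image=screen_image, text=None)       # S250
```

The image display device then displays the received image and, when text arrives with it, converts that text to voice for output.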
- the operation control method of the image display device described above can be implemented as processor-readable code on a processor-readable recording medium provided in the image display device 200.
- the processor-readable recording medium includes all kinds of recording devices in which data readable by the processor is stored. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage devices, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet.
- the processor-readable recording medium can also be distributed over network coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.
Landscapes
- Engineering & Computer Science (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Combustion & Propulsion (AREA)
- Chemical & Material Sciences (AREA)
- Radar, Positioning & Navigation (AREA)
- Automation & Control Theory (AREA)
- Remote Sensing (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Health & Medical Sciences (AREA)
- Controls And Circuits For Display Device (AREA)
- Telephone Function (AREA)
- Navigation (AREA)
Abstract
Description
Claims (20)
- An image display device mounted on a vehicle, comprising: a communication unit for receiving an image from a mobile terminal; a display unit for displaying the received image; a controller for acquiring text corresponding to the displayed image, acquiring the speed of the vehicle, and converting the acquired text into voice when the speed exceeds a threshold speed; and a sound output unit for outputting the converted voice.
- The image display device of claim 1, wherein the communication unit requests text corresponding to the displayed image from the mobile terminal and receives the requested text.
- The image display device of claim 2, wherein the communication unit receives, from the mobile terminal, a message indicating that the displayed image is a text-based image, and the controller requests text corresponding to the displayed image from the mobile terminal when the message is received.
- The image display device of claim 1, wherein the display unit displays, when the speed exceeds the threshold speed, an indicator indicating that voice output of the text corresponding to the image is possible.
- The image display device of claim 1, wherein the controller extracts text corresponding to the displayed image by performing optical character recognition on the displayed image.
- The image display device of claim 1, wherein the controller selects a voice output area in the displayed image and extracts text corresponding to the selected area.
- The image display device of claim 6, further comprising an input unit for receiving an input for selecting the voice output area in the displayed image.
- The image display device of claim 6, wherein the controller detects a plurality of text areas in the displayed image and selects a first area among the plurality of text areas as the voice output area.
- The image display device of claim 8, wherein the controller, after selecting the first area among the plurality of text areas as the voice output area, selects a second area different from the first area among the plurality of text areas as the voice output area.
- The image display device of claim 8, wherein the controller selects a second area different from the first area among the plurality of text areas, together with the first area, as the voice output area.
- The image display device of claim 6, wherein the display unit displays the voice output area in the displayed image so that it is distinguished from areas other than the voice output area.
- The image display device of claim 1, further comprising a microphone for receiving a voice input, wherein the controller converts the voice input into text and determines at least one object corresponding to the converted text in the displayed image.
- The image display device of claim 12, wherein the communication unit transmits a control signal for the determined at least one object to the mobile terminal.
- The image display device of claim 12, further comprising an input unit for receiving an input for selecting one of the determined at least one object, wherein the communication unit transmits a control signal for the selected object to the mobile terminal.
- The image display device of claim 1, further comprising: a speed sensor for calculating the speed of the vehicle; and a GPS module for acquiring GPS information of the vehicle, wherein the controller acquires the speed of the vehicle from the speed sensor or acquires the speed of the vehicle based on the GPS information.
- A mobile terminal comprising: a controller for acquiring text corresponding to a screen image and acquiring the speed of a vehicle; and a communication unit for transmitting the screen image to an image display device mounted on the vehicle, and transmitting the acquired text together with the screen image to the image display device when the speed exceeds a threshold speed.
- The mobile terminal of claim 16, wherein the controller acquires the text from an application corresponding to the screen image when the application is a text-based application.
- The mobile terminal of claim 16, wherein the communication unit transmits, to the image display device, a message indicating that the image is a text-based image when the application corresponding to the screen image is a text-based application.
- The mobile terminal of claim 16, wherein the controller extracts text corresponding to the image by performing optical character recognition on the image when the application corresponding to the image is not a text-based application.
- The mobile terminal of claim 16, wherein the communication unit acquires GPS information of the mobile terminal, and the controller acquires the speed of the vehicle based on the GPS information or acquires the speed of the vehicle from the image display device.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020137025867A KR101525842B1 (ko) | 2011-03-25 | 2011-04-27 | Image processing in image displaying device mounted on vehicle |
EP11862239.8A EP2689969B1 (en) | 2011-03-25 | 2011-04-27 | Image processing in image displaying device mounted on vehicle |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161467819P | 2011-03-25 | 2011-03-25 | |
US61/467,819 | 2011-03-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012133983A1 true WO2012133983A1 (ko) | 2012-10-04 |
Family
ID=46876875
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2011/003092 WO2012133983A1 (ko) | 2011-03-25 | 2011-04-27 | Image processing in image displaying device mounted on vehicle |
Country Status (4)
Country | Link |
---|---|
US (1) | US8907773B2 (ko) |
EP (1) | EP2689969B1 (ko) |
KR (1) | KR101525842B1 (ko) |
WO (1) | WO2012133983A1 (ko) |
Families Citing this family (58)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9990674B1 (en) | 2007-12-14 | 2018-06-05 | Consumerinfo.Com, Inc. | Card registry systems and methods |
US8312033B1 (en) | 2008-06-26 | 2012-11-13 | Experian Marketing Solutions, Inc. | Systems and methods for providing an integrated identifier |
US10489434B2 (en) | 2008-12-12 | 2019-11-26 | Verint Americas Inc. | Leveraging concepts with information retrieval techniques and knowledge bases |
KR101686170B1 (ko) * | 2010-02-05 | 2016-12-13 | 삼성전자주식회사 | 주행 경로 계획 장치 및 방법 |
US20150138300A1 (en) | 2011-09-02 | 2015-05-21 | Microsoft Technology Licensing, Llc | Mobile Video Calls |
US9106691B1 (en) | 2011-09-16 | 2015-08-11 | Consumerinfo.Com, Inc. | Systems and methods of identity protection and management |
US8738516B1 (en) | 2011-10-13 | 2014-05-27 | Consumerinfo.Com, Inc. | Debt services candidate locator |
EP2597838B1 (en) * | 2011-11-28 | 2018-09-05 | Lg Electronics Inc. | Mobile terminal and image display apparatus mounted in a car |
US9836177B2 (en) | 2011-12-30 | 2017-12-05 | Next IT Innovation Labs, LLC | Providing variable responses in a virtual-assistant environment |
US9853959B1 (en) | 2012-05-07 | 2017-12-26 | Consumerinfo.Com, Inc. | Storage and maintenance of personal data |
US9916514B2 (en) * | 2012-06-11 | 2018-03-13 | Amazon Technologies, Inc. | Text recognition driven functionality |
US9039341B2 (en) * | 2012-07-20 | 2015-05-26 | Tyrone Soklaski | System and apparatus for improved wheelchair lift |
JP5734260B2 (ja) * | 2012-11-08 | 2015-06-17 | 本田技研工業株式会社 | 車両用表示装置 |
JP2014094615A (ja) * | 2012-11-08 | 2014-05-22 | Honda Motor Co Ltd | 車両表示装置 |
US9654541B1 (en) | 2012-11-12 | 2017-05-16 | Consumerinfo.Com, Inc. | Aggregating user web browsing data |
US9916621B1 (en) | 2012-11-30 | 2018-03-13 | Consumerinfo.Com, Inc. | Presentation of credit score factors |
JP6089361B2 (ja) * | 2012-12-14 | 2017-03-08 | アルパイン株式会社 | 車載機および車載システムならびに表示制御方法 |
US9672822B2 (en) * | 2013-02-22 | 2017-06-06 | Next It Corporation | Interaction with a portion of a content item through a virtual assistant |
CN104035696B (zh) * | 2013-03-04 | 2017-12-19 | 观致汽车有限公司 | 车载消息中心在触控显示界面上的显示方法及装置 |
US10685487B2 (en) * | 2013-03-06 | 2020-06-16 | Qualcomm Incorporated | Disabling augmented reality (AR) devices at speed |
US9406085B1 (en) | 2013-03-14 | 2016-08-02 | Consumerinfo.Com, Inc. | System and methods for credit dispute processing, resolution, and reporting |
US10102570B1 (en) | 2013-03-14 | 2018-10-16 | Consumerinfo.Com, Inc. | Account vulnerability alerts |
KR20140147329A (ko) * | 2013-06-19 | 2014-12-30 | Samsung Electronics Co., Ltd. | Electronic device displaying a lock screen and method of controlling the same |
CN103738266B (zh) * | 2013-08-21 | 2017-07-18 | Shenzhen Luyuan Technology Co., Ltd. | Method for interconnecting a mobile terminal and a vehicle head unit, and vehicle head unit |
US10054463B2 (en) | 2013-09-26 | 2018-08-21 | Google Llc | Systems and methods for providing navigation data to a vehicle |
US9109917B2 (en) * | 2013-09-26 | 2015-08-18 | Google Inc. | Systems and methods for providing input suggestions via the head unit of a vehicle |
US9958289B2 (en) | 2013-09-26 | 2018-05-01 | Google Llc | Controlling navigation software on a portable device from the head unit of a vehicle |
US9477737B1 (en) | 2013-11-20 | 2016-10-25 | Consumerinfo.Com, Inc. | Systems and user interfaces for dynamic access of multiple remote databases and synchronization of data based on user rules |
CN104850173A (zh) * | 2014-02-17 | 2015-08-19 | Shenzhen Aimeide Technology Co., Ltd. | Extensible mobile communication terminal |
US9381813B2 (en) * | 2014-03-24 | 2016-07-05 | Harman International Industries, Incorporated | Selective message presentation by in-vehicle computing system |
KR20150137799A (ko) * | 2014-05-30 | 2015-12-09 | LG Electronics Inc. | Mobile terminal and method of controlling the same |
KR101579292B1 (ko) * | 2014-08-29 | 2015-12-21 | Seoul National University R&DB Foundation | Universal voice-recognition control apparatus and control method |
KR102309175B1 (ko) * | 2014-08-29 | 2021-10-06 | Samsung Electronics Co., Ltd. | Electronic device for providing scrap information and method of providing the same |
KR101700714B1 (ko) * | 2014-09-17 | 2017-01-31 | Hyundai Motor Company | User interface apparatus, vehicle having the same, and control method thereof |
KR102232583B1 (ko) * | 2015-01-08 | 2021-03-26 | Samsung Electronics Co., Ltd. | Electronic device and web reproduction method thereof |
KR102266660B1 (ko) * | 2015-01-20 | 2021-06-21 | Infobank Corp. | Portable terminal for controlling a vehicle interface and operating method thereof |
DE102015005235B4 (de) * | 2015-04-24 | 2017-02-16 | Audi Ag | Method for remotely controlling a motor vehicle display device, and control system therefor |
WO2016209954A1 (en) * | 2015-06-23 | 2016-12-29 | Google Inc. | Mobile geographic application in automotive environment |
KR101689621B1 (ko) * | 2015-07-22 | 2016-12-27 | Pittasoft Co., Ltd. | Vehicle video recording apparatus with dual Wi-Fi connectivity and video sharing system using the same |
KR102407295B1 (ko) * | 2015-10-30 | 2022-06-10 | Hyundai AutoEver Corp. | Data mirroring method |
KR20170080797A (ko) * | 2015-12-30 | 2017-07-11 | Samsung Display Co., Ltd. | Display system for a vehicle |
KR20170081953A (ko) * | 2016-01-05 | 2017-07-13 | Samsung Electronics Co., Ltd. | Image display apparatus and operating method thereof |
CN106095485A (zh) * | 2016-05-31 | 2016-11-09 | Huizhou Foryou General Electronics Co., Ltd. | Mobile terminal mode switching method and device based on two-device interaction |
KR101882198B1 (ko) * | 2016-11-01 | 2018-07-26 | Hyundai Motor Company | Vehicle and control method thereof |
EP3598727B1 (en) * | 2017-04-11 | 2023-10-25 | Huawei Technologies Co., Ltd. | Message acquisition method and apparatus |
US10824870B2 (en) * | 2017-06-29 | 2020-11-03 | Accenture Global Solutions Limited | Natural language eminence based robotic agent control |
CN108215798B (zh) * | 2017-12-29 | 2019-10-18 | Shanghai Youzhong Technology Co., Ltd. | In-vehicle interface display method and system |
JP2019137357A (ja) | 2018-02-15 | 2019-08-22 | Toyota Motor Corporation | Vehicle sound output and text display device |
US20200074100A1 (en) | 2018-09-05 | 2020-03-05 | Consumerinfo.Com, Inc. | Estimating changes to user risk indicators based on modeling of similarly categorized users |
CN111107304A (zh) * | 2018-10-25 | 2020-05-05 | Shanghai Pateo Electronic Equipment Manufacturing Co., Ltd. | Vehicle-mounted cradle, mobile terminal, vehicle, and mobile-terminal face tracking method |
KR20200050150A (ko) | 2018-11-01 | 2020-05-11 | Hyundai Motor Company | Blockchain-based traffic information processing method and system |
US11315179B1 (en) | 2018-11-16 | 2022-04-26 | Consumerinfo.Com, Inc. | Methods and apparatuses for customized card recommendations |
US11238656B1 (en) * | 2019-02-22 | 2022-02-01 | Consumerinfo.Com, Inc. | System and method for an augmented reality experience via an artificial intelligence bot |
US11080014B2 (en) * | 2019-02-28 | 2021-08-03 | Xevo Inc. | System and method for managing multiple applications in a display-limited environment |
US11941065B1 (en) | 2019-09-13 | 2024-03-26 | Experian Information Solutions, Inc. | Single identifier platform for storing entity data |
GB2594053A (en) * | 2020-04-08 | 2021-10-20 | Continental Automotive Gmbh | Method of displaying information and a display system |
KR102426950B1 (ko) * | 2020-08-03 | 2022-07-29 | Yellowknife Co., Ltd. | Method and apparatus for providing advertising content based on a bicycle rider's position and gaze |
CN114765027A (zh) * | 2021-01-15 | 2022-07-19 | Volvo Car Corporation | Control device, in-vehicle system, and method for vehicle voice control |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1091389A (ja) * | 1996-09-09 | 1998-04-10 | Matsushita Electric Ind Co Ltd | Synthesizer for converting text into speech |
KR100674809B1 (ko) * | 2004-08-24 | 2007-01-26 | LG Electronics Inc. | Navigation system for a mobile body using an AV system and a mobile communication terminal |
KR100832805B1 (ko) * | 2007-07-16 | 2008-05-27 | Cha Sang-Hwan | Apparatus for displaying vehicle driving status and externally input data |
JP2010061137A (ja) * | 2008-09-03 | 2010-03-18 | Honda Motor Co Ltd | Variable text-to-speech for automotive applications |
KR20100070092A (ko) * | 2008-12-17 | 2010-06-25 | Jeong Gwan-Seon | Method of driving a mobile phone to use the touch-screen display of a heterogeneous device as an input/output device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6154658A (en) * | 1998-12-14 | 2000-11-28 | Lockheed Martin Corporation | Vehicle information and safety control system |
DE10046155A1 (de) * | 2000-09-15 | 2002-04-11 | Deutsche Telekom Mobil | Display device for data transmitted via mobile radio |
US20030095688A1 (en) * | 2001-10-30 | 2003-05-22 | Kirmuss Charles Bruno | Mobile motor vehicle identification |
CA2656425C (en) * | 2006-06-29 | 2014-12-23 | Google Inc. | Recognizing text in images |
EP2229576B1 (en) * | 2007-12-05 | 2016-04-13 | Visteon Global Technologies, Inc. | Vehicle user interface systems and methods |
US20090195513A1 (en) * | 2008-02-05 | 2009-08-06 | Delphi Technologies, Inc. | Interactive multimedia control module |
- 2011
- 2011-04-27 WO PCT/KR2011/003092 patent/WO2012133983A1/ko active Application Filing
- 2011-04-27 KR KR1020137025867A patent/KR101525842B1/ko active IP Right Grant
- 2011-04-27 EP EP11862239.8A patent/EP2689969B1/en active Active
- 2011-08-25 US US13/218,384 patent/US8907773B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2689969A4 * |
Also Published As
Publication number | Publication date |
---|---|
KR101525842B1 (ko) | 2015-06-09 |
US20120242473A1 (en) | 2012-09-27 |
EP2689969A1 (en) | 2014-01-29 |
KR20130141672A (ko) | 2013-12-26 |
EP2689969B1 (en) | 2017-09-13 |
EP2689969A4 (en) | 2016-05-04 |
US8907773B2 (en) | 2014-12-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012133983A1 (ko) | Image processing in an image display device mounted in a vehicle | |
WO2012133982A1 (ko) | Image processing apparatus and method of controlling the same | |
WO2013054957A1 (en) | Input interface controlling apparatus and method thereof | |
WO2012133980A1 (ko) | Image processing apparatus and image processing method | |
WO2013035952A1 (en) | Mobile terminal, image display device mounted on vehicle and data processing method using the same | |
WO2017034287A1 (en) | Pedestrial crash prevention system and operation method thereof | |
WO2011136456A1 (en) | Video display apparatus and method | |
WO2013027908A1 (en) | Mobile terminal, image display device mounted on vehicle and data processing method using the same | |
WO2014010879A1 (ko) | Voice recognition apparatus and method thereof | |
WO2014137074A1 (en) | Mobile terminal and method of controlling the mobile terminal | |
WO2011093560A1 (en) | Information display apparatus and method thereof | |
WO2014189200A1 (ko) | Image display apparatus and operating method thereof | |
WO2012093784A2 (en) | Information display device and method for the same | |
WO2013133464A1 (en) | Image display device and method thereof | |
WO2018026059A1 (ko) | Mobile terminal and control method thereof | |
WO2012020863A1 (ko) | Mobile terminal, display device, and control method thereof | |
WO2012036323A1 (ko) | Communication terminal and control method thereof | |
WO2012133981A1 (ko) | Image display apparatus and operating method thereof | |
WO2015178574A1 (en) | Information providing system and method thereof | |
WO2016076474A1 (ko) | Mobile terminal and control method thereof | |
WO2016010262A1 (en) | Mobile terminal and controlling method thereof | |
WO2020162709A1 (en) | Electronic device for providing graphic data based on voice and operating method thereof | |
WO2018030646A1 (en) | Display device and method of controlling therefor | |
WO2012046891A1 (ko) | Mobile terminal, display device, and control method thereof | |
WO2017039103A1 (ko) | Mobile terminal and control method thereof | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11862239 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2011862239 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011862239 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20137025867 Country of ref document: KR Kind code of ref document: A |