
KR20130093364A - Image display apparatus, and method for operating the same - Google Patents


Info

Publication number
KR20130093364A
KR20130093364A (application KR1020120014917A)
Authority
KR
South Korea
Prior art keywords
image
displaying
template
display
external device
Prior art date
Application number
KR1020120014917A
Other languages
Korean (ko)
Inventor
김운영
이강섭
이형남
이건식
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020120014917A
Publication of KR20130093364A

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 — Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 — Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194 — Transmission of image signals
    • H04N13/20 — Image signal generators
    • H04N13/261 — Image signal generators with monoscopic-to-stereoscopic image conversion
    • H04N13/30 — Image reproducers
    • H04N13/398 — Synchronisation thereof; Control thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present invention relates to an image display apparatus and an operating method thereof. According to an embodiment of the present invention, a method of operating an image display apparatus includes: displaying a template for generating 3D content; when an image is selected for insertion, inserting the selected image into an image-insertable object in the template and displaying it; and, when a 3D-content view input is received, displaying the 3D content including the inserted image as a 3D image. This improves user convenience.

Description

[0001] The present invention relates to an image display apparatus and a method of operating the same.

The present invention relates to an image display apparatus and an operation method thereof, and more particularly, to an image display apparatus and an operation method thereof that can improve user convenience.

A video display device is a device having a function of displaying an image that a user can view. The user can view the broadcast through the video display device. A video display device displays a broadcast selected by a user among broadcast signals transmitted from a broadcast station on a display. Currently, broadcasting is changing from analog broadcasting to digital broadcasting around the world.

Digital broadcasting refers to broadcasting in which digital video and audio signals are transmitted. Digital broadcasting is more resistant to external noise than analog broadcasting, so it has less data loss, is advantageous for error correction, has a higher resolution, and provides a clearer picture. Also, unlike analog broadcasting, digital broadcasting is capable of bidirectional service.

SUMMARY OF THE INVENTION

An object of the present invention is to provide an image display apparatus, and an operating method thereof, which can improve user convenience.

In addition, another object of the present invention is to provide an image display apparatus, and an operation method thereof that can easily perform 3D content production.

In addition, another object of the present invention is to provide an image display device and an operation method thereof that can easily transmit 3D content to an external device.

Another object of the present invention is to provide an image display apparatus capable of simply converting a 2D image into a 3D image, and an operation method thereof.

According to an aspect of the present invention, there is provided a method of operating an image display apparatus, including: displaying a template for generating 3D content; when an image is selected for insertion, inserting the selected image into an image-insertable object in the template and displaying it; and, when a 3D-content view input is received, displaying the 3D content including the inserted image as a 3D image.

In addition, a method of operating an image display apparatus according to an embodiment of the present invention includes: displaying a transmission object for transmitting a 3D image to an external device; when the transmission object is selected, displaying a transmission-target list including external devices capable of receiving the transmission; and, when a given external device in the transmission-target list is selected, transmitting the 3D image, or an image within the 3D image, to that external device.

In addition, a method of operating an image display apparatus according to an embodiment of the present invention includes: displaying a 2D image; displaying a conversion object for converting the 2D image into a 3D image; and, when the conversion object is selected, converting the 2D image into a 3D image and displaying the converted 3D image.
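The 2D-to-3D conversion above can be illustrated with a minimal sketch: a stereo pair is derived from a single 2D frame by horizontally shifting pixels to create disparity between the left-eye and right-eye views. The function name and the frame-as-list-of-rows representation are illustrative assumptions, not the patent's method:

```python
def convert_2d_to_3d(frame, disparity=1):
    """Derive a crude stereo pair from a 2D frame.

    `frame` is a list of rows of pixel values. The left-eye view is the
    original frame; the right-eye view is the frame rotated horizontally
    by `disparity` pixels, which creates a uniform depth cue.
    """
    left_eye = [row[:] for row in frame]                               # unchanged copy
    right_eye = [row[disparity:] + row[:disparity] for row in frame]   # shifted copy
    return left_eye, right_eye

frame = [[0, 1, 2, 3], [4, 5, 6, 7]]
left, right = convert_2d_to_3d(frame, disparity=1)
# right == [[1, 2, 3, 0], [5, 6, 7, 4]]
```

Real 2D-to-3D conversion estimates per-pixel depth rather than applying a uniform shift; this sketch only shows where the disparity between the two eye views comes from.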

In addition, an image display apparatus according to an embodiment of the present invention includes a display for displaying a template for generating 3D content, and a control unit which, when an image is selected for insertion, controls the selected image to be inserted into an image-insertable object in the template and displayed; the control unit controls the 3D content including the inserted image to be displayed as a 3D image when a 3D-content view input is received.

In addition, an image display apparatus according to an embodiment of the present invention includes a display for displaying a transmission object for transmitting a 3D image to an external device, and a control unit which, when the transmission object is selected, controls a transmission-target list of external devices capable of receiving the transmission to be displayed; the control unit transmits the 3D image, or an image within the 3D image, to the corresponding external device when a given external device in the transmission-target list is selected.

In addition, an image display apparatus according to an embodiment of the present invention includes a display for displaying a 2D image and a conversion object for converting the 2D image into a 3D image, and a control unit which, when the conversion object is selected, controls the 2D image to be converted into a 3D image.

According to an embodiment of the present invention, 3D content can be generated simply by inserting an image into a template provided for generating 3D content. In addition, by displaying the generated 3D content as a 3D image, user convenience can be increased.
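As a rough illustration of the template-based flow, the sketch below models a template with image-insertable placeholder objects; an image is inserted into a selected placeholder, and the content is considered viewable as 3D once the placeholders are filled. All class and method names are hypothetical:

```python
class Template:
    """A 3D-content template containing image-insertable objects."""

    def __init__(self, placeholder_ids):
        # Each placeholder is an object into which an image can be inserted.
        self.slots = {pid: None for pid in placeholder_ids}

    def insert_image(self, placeholder_id, image):
        """Insert the selected image into an insertable object of the template."""
        if placeholder_id not in self.slots:
            raise KeyError(f"no insertable object named {placeholder_id!r}")
        self.slots[placeholder_id] = image

    def ready_for_3d_view(self):
        """3D content can be displayed once every placeholder holds an image."""
        return all(img is not None for img in self.slots.values())

template = Template(["photo_frame", "background"])
template.insert_image("photo_frame", "family.jpg")
template.insert_image("background", "beach.jpg")
print(template.ready_for_3d_view())  # True
```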

On the other hand, by displaying a transmission object for transmitting 3D content from the image display apparatus to an external device, the selected 3D image, or an image within the 3D image, can be transmitted to the external device simply.

At this time, by displaying an object representing the 3D image, or the image within the 3D image, that is to be transmitted, the user can grasp the transmitted content at a glance.


In addition, by displaying a conversion object for converting the displayed 2D image into a 3D image, conversion to 3D can be performed easily.

FIG. 1 is a view showing the appearance of a video display device of the present invention.
FIG. 2 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.
FIGS. 3A and 3B are internal block diagrams of a set-top box according to an embodiment of the present invention.
FIG. 4 is an internal block diagram of the control unit of FIG. 2.
FIG. 5 is a diagram illustrating various formats of a 3D image.
FIG. 6 is a diagram illustrating the operation of a viewing device according to the formats of FIG. 5.
FIG. 7 is a diagram illustrating various scaling methods of 3D video signals according to an embodiment of the present invention.
FIG. 8 is a diagram illustrating an image formed by a left-eye image and a right-eye image.
FIG. 9 is a diagram illustrating the depth of a 3D image according to the distance between a left-eye image and a right-eye image.
FIG. 10 is a diagram illustrating a control method of the remote control device of FIG. 2.
FIG. 11 is an internal block diagram of the remote control device of FIG. 2.
FIG. 12 is a flowchart illustrating an operating method of an image display apparatus according to an embodiment of the present invention.
FIGS. 13 to 18C are views referred to in describing various examples of the operating method of the image display apparatus of FIG. 12.
FIG. 19 is a flowchart illustrating an operating method of an image display apparatus according to another embodiment of the present invention.
FIGS. 20A to 22B are views referred to in describing various examples of the operating method of the image display apparatus of FIG. 19.
FIG. 23 is a flowchart illustrating an operating method of an image display apparatus according to another exemplary embodiment.
FIGS. 24A to 25B are views referred to in describing various examples of the operating method of the image display apparatus of FIG. 23.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffixes "module" and "part" used for components in the following description are given merely for convenience of description and carry no special meaning or role in themselves. Accordingly, the terms "module" and "part" may be used interchangeably.

FIG. 1 is a view showing the appearance of a video display device of the present invention.

Referring to FIG. 1, an image display apparatus 100 according to an embodiment of the present invention may be a fixed image display apparatus or a mobile image display apparatus.

According to the embodiment of the present invention, the image display apparatus 100 can perform signal processing of a 3D image. For example, when the 3D image input to the image display apparatus 100 includes a plurality of viewpoint images, the left-eye image and the right-eye image are signal-processed and arranged according to one of the formats of FIG. 5, and the 3D video can be displayed in that format.

Meanwhile, the video display device 100 described in the present specification may include a TV receiver, a monitor, a projector, a notebook computer, a digital broadcasting terminal, and the like.

FIG. 2 is an internal block diagram of an image display apparatus according to an embodiment of the present invention.

Referring to FIG. 2, an image display apparatus 100 according to an exemplary embodiment of the present invention includes a broadcast receiving unit 105, an external device interface unit 130, a storage unit 140, a user input interface unit 150, a sensor unit (not shown), a control unit 170, a display 180, an audio output unit 185, and a viewing device 195.

The broadcast receiving unit 105 may include a tuner unit 110, a demodulation unit 120, and a network interface unit 135. Of course, as necessary, the broadcast receiving unit 105 may be designed to include the tuner unit 110 and the demodulation unit 120 without the network interface unit 135, or conversely to include the network interface unit 135 without the tuner unit 110 and the demodulation unit 120.

The tuner unit 110 selects an RF broadcast signal corresponding to a channel selected by the user or all pre-stored channels among RF (Radio Frequency) broadcast signals received through the antenna 50. Also, the selected RF broadcast signal is converted into an intermediate frequency signal, a baseband image, or a voice signal.

For example, if the selected RF broadcast signal is a digital broadcast signal, it is converted into a digital IF signal (DIF). If the selected RF broadcast signal is an analog broadcast signal, it is converted into an analog baseband image or voice signal (CVBS / SIF). That is, the tuner unit 110 can process a digital broadcast signal or an analog broadcast signal. The analog baseband video or audio signal (CVBS / SIF) output from the tuner unit 110 can be directly input to the controller 170.

The tuner unit 110 may receive an RF broadcast signal of a single carrier according to the ATSC (Advanced Television Systems Committee) scheme, or an RF broadcast signal of a plurality of carriers according to the DVB (Digital Video Broadcasting) scheme.

Meanwhile, the tuner unit 110 may sequentially select the RF broadcast signals of all broadcast channels stored through a channel-memory function, among the RF broadcast signals received through the antenna, and convert them into intermediate-frequency signals or baseband video or audio signals.

On the other hand, the tuner unit 110 may be provided with a plurality of tuners in order to receive broadcast signals of a plurality of channels. Alternatively, a single tuner that simultaneously receives broadcast signals of a plurality of channels is also possible.

The demodulator 120 receives the digital IF signal DIF converted by the tuner 110 and performs a demodulation operation.

The demodulation unit 120 may perform demodulation and channel decoding, and then output a stream signal TS. In this case, the stream signal may be a signal multiplexed with a video signal, an audio signal, or a data signal.

The stream signal output from the demodulator 120 may be input to the controller 170. The control unit 170 performs demultiplexing, video / audio signal processing, and the like, and then outputs an image to the display 180 and outputs audio to the audio output unit 185.

The external device interface unit 130 can transmit data to, or receive data from, a connected external device 190. To this end, the external device interface unit 130 may include an A/V input/output unit (not shown) or a wireless communication unit (not shown).

The external device interface unit 130 can be connected, by wire or wirelessly, to an external device such as a DVD (Digital Versatile Disc) player, a Blu-ray player, a game device, a camera, a camcorder, or a computer (notebook computer), and may perform input/output operations with the external device.

The A / V input / output unit can receive video and audio signals from an external device. Meanwhile, the wireless communication unit can perform short-range wireless communication with other electronic devices.

The network interface unit 135 provides an interface for connecting the video display device 100 to a wired/wireless network including the Internet. For example, the network interface unit 135 can receive, via the network, content or data provided by a content provider or a network operator over the Internet.

The storage unit 140 may store a program for each signal processing and control in the control unit 170 or may store the processed video, audio, or data signals.

In addition, the storage unit 140 may temporarily store video, audio, or data signals input to the external device interface unit 130. The storage unit 140 may also store information on predetermined broadcast channels through a channel-memory function such as a channel map.

Although the storage unit 140 of FIG. 2 is provided separately from the control unit 170, the scope of the present invention is not limited thereto. The storage unit 140 may be included in the controller 170.

The user input interface unit 150 transmits a signal input by the user to the control unit 170 or a signal from the control unit 170 to the user.

For example, the user input interface unit 150 may receive and process control signals for power on/off, channel selection, screen setting, and the like from the remote control apparatus 200; may transmit user input signals from local keys (not shown), such as a power key, a channel key, a volume key, and a setting key, to the control unit 170; and may transmit a user input signal from a sensor unit (not shown) that senses a user's gesture to the control unit 170, or transmit a signal from the control unit 170 to the sensor unit (not shown).

The control unit 170 may demultiplex the stream input through the tuner unit 110, the demodulation unit 120, or the external device interface unit 130, or process the demultiplexed signals, to generate and output signals for video or audio output.

The video signal processed by the controller 170 may be input to the display 180 and displayed as an image corresponding to the video signal. Also, the image signal processed by the controller 170 may be input to the external output device through the external device interface unit 130.

The audio signal processed by the control unit 170 may be output as sound to the audio output unit 185. The audio signal processed by the control unit 170 may also be output to an external output device through the external device interface unit 130.

Although not shown in FIG. 2, the control unit 170 may include a demultiplexer, an image processor, and the like. This will be described later with reference to FIG. 4.

In addition, the control unit 170 can control the overall operation of the video display device 100. For example, the control unit 170 may control the tuner unit 110 to tune to the RF broadcast corresponding to the channel selected by the user or a previously stored channel.

In addition, the controller 170 may control the image display apparatus 100 by a user command or an internal program input through the user input interface unit 150.

Meanwhile, the control unit 170 may control the display 180 to display an image. In this case, the image displayed on the display 180 may be a still image or a video, and may be a 2D image or a 3D image.

Meanwhile, the controller 170 may generate a 3D object for a predetermined 2D object among the images displayed on the display 180, and display the 3D object. For example, the object may be at least one of a connected web screen (newspaper, magazine, etc.), EPG (Electronic Program Guide), various menus, widgets, icons, still images, moving images, and text.

Such a 3D object may be processed to have a different depth from the image displayed on the display 180. Preferably, the 3D object may be processed to appear to protrude from the image displayed on the display 180.

On the other hand, the control unit 170 can recognize the position of the user based on the image photographed from the photographing unit (not shown). For example, the distance (z-axis coordinate) between the user and the image display apparatus 100 can be grasped. In addition, the x-axis coordinate and the y-axis coordinate in the display 180 corresponding to the user position can be grasped.

Although not shown in the drawing, a channel browsing processing unit for generating thumbnail images corresponding to channel signals or external input signals may further be provided. The channel browsing processing unit receives the stream signal TS output from the demodulation unit 120 or the stream signal output from the external device interface unit 130, extracts an image from the input stream signal, and generates a thumbnail image. The generated thumbnail image may be decoded together with the decoded image and input to the control unit 170. The control unit 170 may display a thumbnail list having a plurality of thumbnail images on the display 180 using the input thumbnail images.

At this time, the thumbnail list may be displayed in a simple view mode displayed on a partial area in a state where a predetermined image is displayed on the display 180, or in a full viewing mode displayed in most areas of the display 180. The thumbnail images in the thumbnail list can be sequentially updated.

The display 180 converts the image signal, data signal, and OSD signal processed by the control unit 170, or the image signal, data signal, and control signal received from the external device interface unit 130, to generate a driving signal.

The display 180 may be a PDP, an LCD, an OLED, a flexible display, or the like, and may also be capable of a 3D display.

For viewing three-dimensional images, display methods may be divided into an additional-display method and a single-display method.

The single-display method implements a 3D image on the display 180 alone, without a separate additional display such as glasses; for example, various methods such as the lenticular method and the parallax-barrier method can be applied.

The additional-display method implements a 3D image using an additional display, namely the viewing device 195, in addition to the display 180; for example, various methods such as the head-mounted display (HMD) type and the glasses type are applied.

On the other hand, the glasses type may be divided into a passive scheme, such as polarized glasses, and an active scheme, such as shutter glasses. Likewise, the head-mounted display type may be divided into passive and active schemes.

On the other hand, the viewing device 195 may be 3D glasses for stereoscopic viewing. The 3D glasses 195 may include passive polarized glasses or active shutter glasses, and the term is used here as a concept that also includes the head-mounted type described above.

For example, when the viewing apparatus 195 is a polarizing glass, the left eye glass can be realized as a left eye polarizing glass, and the right eye glass can be realized as a right eye polarizing glass.

As another example, when the viewing apparatus 195 is a shutter glass, the left eye glass and the right eye glass can be alternately opened and closed.
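The alternating open/close behavior of shutter glasses can be sketched as a schedule that pairs each frame of a frame-sequential 3D stream with the eye whose shutter is open. This is a simplified model for illustration; real glasses are synchronized to the display's refresh timing:

```python
def shutter_schedule(frames):
    """Pair each displayed frame with the eye whose shutter is open.

    In a frame-sequential stream, left-eye and right-eye frames alternate;
    while one eye's shutter is open, the other eye's shutter is closed.
    """
    schedule = []
    for i, frame in enumerate(frames):
        eye = "left" if i % 2 == 0 else "right"
        schedule.append((frame, eye))
    return schedule

print(shutter_schedule(["L0", "R0", "L1", "R1"]))
# [('L0', 'left'), ('R0', 'right'), ('L1', 'left'), ('R1', 'right')]
```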

Meanwhile, the display 180 may be configured as a touch screen and used as an input device in addition to the output device.

The audio output unit 185 receives the audio signal processed by the control unit 170 and outputs it as sound.

A photographing unit (not shown) photographs the user. The photographing unit (not shown) may be implemented by a single camera, but the present invention is not limited thereto, and may be implemented by a plurality of cameras. On the other hand, the photographing unit (not shown) may be embedded in the image display device 100 on the upper side of the display 180 or may be disposed separately. The image information photographed by the photographing unit (not shown) may be input to the control unit 170.

The control unit 170 can detect the gesture of the user based on each of the images photographed from the photographing unit (not shown) or the signals sensed from the sensor unit (not shown) or a combination thereof.

The remote control apparatus 200 transmits user input to the user input interface unit 150. To this end, the remote control apparatus 200 can use Bluetooth, RF (radio frequency) communication, infrared (IR) communication, UWB (Ultra Wideband), ZigBee, or the like. The remote control apparatus 200 can also receive the video, audio, or data signal output from the user input interface unit 150 and display or output it on the remote control apparatus 200.

Meanwhile, a block diagram of the image display device 100 shown in FIG. 2 is a block diagram for an embodiment of the present invention. Each component of the block diagram may be integrated, added, or omitted according to the specifications of the image display apparatus 100 actually implemented. That is, two or more constituent elements may be combined into one constituent element, or one constituent element may be constituted by two or more constituent elements, if necessary. In addition, the functions performed in each block are intended to illustrate the embodiments of the present invention, and the specific operations and apparatuses do not limit the scope of the present invention.

Meanwhile, unlike FIG. 2, the video display apparatus 100 may omit the tuner unit 110 and the demodulation unit 120 shown in FIG. 2, and instead receive and play back video content through the network interface unit 135 or the external device interface unit 130.

On the other hand, the image display apparatus 100 is an example of a video signal processing apparatus that performs signal processing on a stored or input image. Other examples of such video signal processing apparatuses include a set-top box excluding the display 180 and the audio output unit 185 shown in FIG. 2, a DVD player, a Blu-ray player, a game machine, and a computer. The set-top box will be described below with reference to FIGS. 3A and 3B.

FIGS. 3A and 3B are internal block diagrams of a set-top box according to an embodiment of the present invention.

Referring to FIG. 3A, the set-top box 250 may include a broadcast receiving unit 272, a network interface unit 255, a storage unit 258, a signal processing unit 260, a user input interface unit 263, and an external device interface unit 265.

The broadcast receiving unit 272 may include a tuner unit 270 and a demodulation unit 275. In particular, it can receive a broadcast signal via the antenna 50. The received broadcast signal can be input to the signal processing unit 260.

The network interface unit 255 provides an interface for connecting to a wired / wireless network including an internet network. It can also transmit or receive data to other users or other electronic devices via the connected network or other network linked to the connected network.

The storage unit 258 may store programs for signal processing and control in the signal processing unit 260, and may also temporarily store video, audio, or data signals input from the external device interface unit 265 or the network interface unit 255.

The signal processor 260 performs signal processing on the input signal. For example, it is possible to demultiplex or decode an input video signal, and perform demultiplexing or decoding of an input audio signal. To this end, a video decoder or a voice decoder may be provided. The processed video signal or audio signal can be transmitted to the video display device 100 through the external device interface unit 265.

In particular, according to an embodiment of the present invention, the signal processor 260 may signal-process the received 3D broadcast signal. In detail, the signal processor 260 may receive a 3D broadcast signal that cannot be processed by the image display apparatus 100 and perform signal processing on the 3D broadcast signal.

For example, when the input 3D broadcast signal is a dual stream encoded in MPEG-2 and MPEG-4 (H.264) respectively, the signal processor 260 may demultiplex the input 3D broadcast signal, perform MPEG-2 decoding and MPEG-4 (H.264) decoding, and separate the result into a left-eye image and a right-eye image.
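The dual-stream path described above can be sketched as follows; the demultiplexer and both decoders are stubbed out, and the stream representation is invented for illustration:

```python
def demultiplex(dual_stream):
    # Split the multiplexed dual stream into its two elementary streams.
    return dual_stream["mpeg2"], dual_stream["h264"]

def decode(codec, elementary_stream):
    # Stand-in for a real video decoder; returns decoded "pictures".
    return [f"{codec}-picture-{n}" for n in elementary_stream]

def process_dual_stream(dual_stream):
    """Demultiplex, decode each view, and pair left-eye/right-eye images."""
    mpeg2_es, h264_es = demultiplex(dual_stream)
    left_eye = decode("mpeg2", mpeg2_es)   # MPEG-2 stream -> left-eye images
    right_eye = decode("h264", h264_es)    # H.264 stream  -> right-eye images
    return list(zip(left_eye, right_eye))

pairs = process_dual_stream({"mpeg2": [0, 1], "h264": [0, 1]})
# [('mpeg2-picture-0', 'h264-picture-0'), ('mpeg2-picture-1', 'h264-picture-1')]
```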

As another example, when the input 3D broadcast signal is in a 3D image format not illustrated in FIG. 5, that format may likewise be signal-processed and separated into a left-eye image and a right-eye image.

The 3D video signal (left-eye and right-eye video signals) processed by the signal processor 260 can be transmitted, through the external device interface unit 265, to the image display device 100, in particular to its external device interface unit 130.

Meanwhile, the signal processor 260 may generate an object that informs the user to enter the external input display mode and control the object to be transmitted to the image display apparatus 100.

The signal processor 260 may generate a 3D image setting menu and control the 3D image setting menu to be transmitted to the image display apparatus 100.

The user input interface unit 263 transmits a signal input by the user to the signal processor 260 or transmits a signal from the signal processor 260 to the user. For example, various control signals, such as power on / off, operation input, setting input, etc., which are input through a local key (not shown) or the remote control apparatus 200, may be received and transmitted to the signal processor 260.

The external device interface unit 265 provides an interface for data transmission or reception with an external device connected by wire or wirelessly. In particular, it provides an interface for data transmission or reception with the video display device 100. It is also possible to provide an interface for data transmission or reception with an external device such as a game device, a camera, a camcorder, a computer (notebook computer) or the like.

The set-top box 250 may further include a media input unit (not shown) for playing separate media, for example a Blu-ray input unit (not shown). That is, the set-top box 250 may include a Blu-ray player. After signal processing such as demultiplexing or decoding in the signal processing unit 260, input media such as a Blu-ray disc can be transmitted to the video display device 100 through the external device interface unit 265 for display.

Referring to FIG. 3B, the set-top box 250 is similar to the set-top box 250 of FIG. 3A, except that it has no separate broadcast receiving unit 272.

That is, since the set-top box 250 has no separate broadcast receiving unit 272, when there is a 3D broadcast signal that the video display device 100 cannot process, the 3D broadcast signal transmitted from the external device interface unit 130 of the video display device 100 may be received by the external device interface unit 265.

The 3D broadcast signal received by the external device interface unit 265 may be input to the signal processor 260.

The operation of the signal processing unit 260 and the other components is the same as described with reference to FIG. 3A, and the description is not repeated here.

FIG. 4 is an internal block diagram of the control unit of FIG. 2; FIG. 5 is a diagram illustrating various formats of a 3D image; and FIG. 6 is a diagram illustrating the operation of a viewing device according to the formats of FIG. 5.

The control unit 170 may include a demultiplexing unit 310, an image processing unit 320, a processor 330, an OSD generating unit 340, a mixer 345, a frame rate conversion unit 350, and a formatter 360. It may also include an audio processing unit (not shown) and a data processing unit (not shown).

The demultiplexer 310 demultiplexes the input stream. For example, when an MPEG-2 TS is input, it can be demultiplexed into video, audio, and data signals, respectively. The stream signal input to the demultiplexer 310 may be a stream signal output from the tuner 110 or the demodulator 120 or the external device interface 130.
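Demultiplexing an MPEG-2 TS into video, audio, and data signals can be sketched as routing packets by their PID (packet identifier); the packet representation and the PID values here are invented for illustration, not taken from any real PSI table:

```python
VIDEO_PID, AUDIO_PID, DATA_PID = 0x100, 0x101, 0x102  # example PID assignments

def demux(packets):
    """Route (pid, payload) packets into video, audio, and data streams."""
    streams = {"video": [], "audio": [], "data": []}
    for pid, payload in packets:
        if pid == VIDEO_PID:
            streams["video"].append(payload)
        elif pid == AUDIO_PID:
            streams["audio"].append(payload)
        elif pid == DATA_PID:
            streams["data"].append(payload)
        # packets with any other PID (e.g. PSI tables, padding) are ignored here
    return streams

ts = [(0x100, "v0"), (0x101, "a0"), (0x100, "v1"), (0x102, "d0")]
print(demux(ts))  # {'video': ['v0', 'v1'], 'audio': ['a0'], 'data': ['d0']}
```

A real demultiplexer would first parse the PAT and PMT tables to learn which PIDs carry which elementary streams, rather than hard-coding them.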

The image processor 320 may perform image processing of the demultiplexed image signal. For this, the image processing unit 320 may include a video decoder 325 and a scaler 335.

The video decoder 325 decodes the demultiplexed video signal, and the scaler 335 performs scaling so that the decoded video signal can be output at a resolution suitable for the display 180.

The video decoder 325 can include a decoder of various standards.

On the other hand, the image signal decoded by the image processing unit 320 can be divided into a case where there is only a 2D image signal, a case where a 2D image signal and a 3D image signal are mixed, and a case where there is only a 3D image signal.

For example, an external video signal input from the external device 190 or a broadcast video signal of a broadcast signal received from the tuner unit 110 may contain only a 2D video signal, a mixture of a 2D video signal and a 3D video signal, or only a 3D video signal. Accordingly, the controller 170, in particular the image processing unit 320, processes the signal so that a 2D video signal, a mixed video signal of a 2D video signal and a 3D video signal, or a 3D video signal can be output.

Meanwhile, the image signal decoded by the image processing unit 320 may be a 3D image signal in various formats. For example, it may be a 3D image signal composed of a color image and a depth image, or a 3D image signal composed of a plurality of viewpoint image signals. The plurality of viewpoint image signals may include, for example, a left eye image signal and a right eye image signal.

Here, as shown in FIG. 5, the format of the 3D video signal may be a side-by-side format (FIG. 5A) in which the left eye video signal L and the right eye video signal R are arranged left and right, a top/down format (FIG. 5B) in which they are arranged up and down, a frame sequential format (FIG. 5C) in which they are arranged by time division, an interlaced format (FIG. 5D) in which the left eye video signal and the right eye video signal are mixed line by line, or a checker box format (FIG. 5E) in which the left eye video signal and the right eye video signal are mixed box by box.
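As an illustration of the spatial and temporal arrangements just described, the following sketch packs a left eye frame and a right eye frame into three of the formats of FIG. 5. This is not the patent's implementation; plain nested lists stand in for pixel arrays.

```python
# Hypothetical sketch: packing a left eye frame L and a right eye frame R
# into formats of FIG. 5, using nested lists as stand-ins for pixel arrays.

def side_by_side(left, right):
    """FIG. 5A: L and R arranged left and right in one frame."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def top_down(left, right):
    """FIG. 5B: L and R arranged up and down in one frame."""
    return left + right

def frame_sequential(left, right):
    """FIG. 5C: L and R arranged by time division (alternating frames)."""
    return [left, right]

L = [["L"] * 4 for _ in range(2)]  # 2x4 left eye frame
R = [["R"] * 4 for _ in range(2)]  # 2x4 right eye frame

assert side_by_side(L, R)[0] == ["L", "L", "L", "L", "R", "R", "R", "R"]
assert top_down(L, R)[2] == ["R", "R", "R", "R"]
assert frame_sequential(L, R)[1] is R
```

The interlaced and checker box formats would mix the two frames line by line and box by box in the same manner.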

The processor 330 may control the overall operation of the image display apparatus 100 or of the control unit 170. For example, the processor 330 may control the tuner 110 to select an RF broadcast corresponding to a channel selected by the user or to a previously stored channel.

In addition, the processor 330 may control the image display apparatus 100 by a user command input through the user input interface unit 150 or by an internal program.

In addition, the processor 330 may perform data transfer control with the network interface unit 135 or the external device interface unit 130.

The processor 330 may control operations of the demultiplexing unit 310, the image processing unit 320, the OSD generating unit 340, and the like in the controller 170.

The OSD generator 340 generates an OSD signal according to a user input or by itself. For example, based on a user input signal, it can generate a signal for displaying various information in graphic or text form on the screen of the display 180. The generated OSD signal may include various data such as a user interface screen of the video display device 100, various menu screens, widgets, and icons. In addition, the generated OSD signal may include a 2D object or a 3D object.

The OSD generating unit 340 can generate a pointer that can be displayed on the display, based on the pointing signal input from the remote control device 200. In particular, such a pointer may be generated by a pointing signal processor, and the OSD generating unit 340 may include such a pointing signal processor (not shown). Of course, the pointing signal processor (not shown) may also be provided separately from the OSD generating unit 340.

The mixer 345 may mix the OSD signal generated by the OSD generator 340 and the decoded video signal processed by the image processor 320. At this time, the OSD signal and the decoded video signal may include at least one of a 2D signal and a 3D signal. The mixed video signal is supplied to a frame rate converter 350.

A frame rate converter (FRC) 350 can convert the frame rate of an input image. Alternatively, the frame rate converter 350 can output the input image without frame rate conversion.

The formatter 360 may arrange the left eye image frames and the right eye image frames of the frame-rate-converted 3D image. It may also output a synchronization signal Vsync for opening the left eye glass and the right eye glass of the 3D viewing apparatus 195.

The formatter 360 receives the mixed signal, i.e., the OSD signal and the decoded video signal, from the mixer 345, and separates the 2D video signal and the 3D video signal.

In the present specification, a 3D video signal means a signal representing a 3D object. Examples of the 3D object include a picture-in-picture (PIP) image (still image or moving picture), an EPG indicating broadcast program information, various menus, widgets, icons, text, objects in an image, a person, a background, and a web screen (newspaper, magazine, etc.).

On the other hand, the formatter 360 can change the format of the 3D video signal. For example, it may change the format to any of the various formats illustrated in FIG. 5. Accordingly, the glasses-type viewing apparatus may operate according to the format, as shown in FIG. 6.

First, FIG. 6A illustrates an operation of the 3D glasses 195, in particular the shutter glass 195, when the formatter 360 arranges and outputs the frame sequential format among the formats of FIG. 5.

That is, when the left eye image L is displayed on the display 180, the left eye glass of the shutter glass 195 is opened and the right eye glass is closed. When the right eye image R is displayed, The left eye glass is closed, and the right eye glass is opened.
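The alternating shutter behavior described above can be sketched as a simple function of the frame index. This is an illustrative model, not the patent's implementation; it assumes frames alternate L, R, L, R, ... starting from index 0.

```python
# Hypothetical sketch: shutter glass control for the frame sequential
# format (FIG. 6A). While a left eye frame is on the display, only the
# left glass is open; for a right eye frame, only the right glass is open.

def shutter_state(frame_index):
    """Return (left_open, right_open) for the displayed frame,
    assuming frames alternate L, R, L, R, ... from index 0."""
    is_left_frame = (frame_index % 2 == 0)
    return (is_left_frame, not is_left_frame)

assert shutter_state(0) == (True, False)   # left eye image L displayed
assert shutter_state(1) == (False, True)   # right eye image R displayed
```

In practice the switching would be driven by the synchronization signal Vsync output by the formatter 360 rather than by a frame counter.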

FIG. 6B illustrates an operation of the 3D glasses 195, in particular the polarized glasses 195, when the formatter 360 outputs the side-by-side format among the formats of FIG. 5. Meanwhile, the 3D glasses 195 applied in FIG. 6B may be shutter glasses; in that case, the shutter glasses may operate as polarized glasses by keeping both the left eye glass and the right eye glass open.

Meanwhile, the formatter 360 may convert a 2D video signal into a 3D video signal. For example, according to a 3D image generation algorithm, an edge or a selectable object may be detected in the 2D image signal, and the object defined by the detected edge, or the selectable object, may be separated to generate a 3D image signal. At this time, the generated 3D image signal can be separated into a left eye image signal L and a right eye image signal R, as described above.
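The idea of separating an object and giving it a horizontal shift to synthesize the two eye views can be sketched on a single scanline of pixels. This is an illustrative toy, not the patent's algorithm; the object bounds and disparity value are assumptions.

```python
# Hypothetical sketch of 2D-to-3D conversion: a separated object is pasted
# at shifted positions to synthesize a left eye and a right eye scanline.

def synthesize_stereo(scanline, obj_start, obj_end, disparity):
    """Shift the object pixels by half the disparity toward each eye."""
    def shifted(offset):
        out = list(scanline)
        obj = scanline[obj_start:obj_end]
        for i in range(obj_start, obj_end):
            out[i] = 0                       # clear original position
        for i, v in enumerate(obj):
            out[obj_start + offset + i] = v  # paste object shifted
        return out
    half = disparity // 2
    return shifted(-half), shifted(+half)    # (left eye, right eye)

line = [0, 0, 5, 5, 5, 0, 0, 0]              # object occupies indices 2..4
left, right = synthesize_stereo(line, 2, 5, 2)
assert left == [0, 5, 5, 5, 0, 0, 0, 0]
assert right == [0, 0, 0, 5, 5, 5, 0, 0]
```

The opposite shifts of the object in the two views are what the display step later turns into perceived depth.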

Although not shown in the drawing, a 3D processor (not shown) for three-dimensional effect signal processing may be further disposed after the formatter 360. The 3D processor (not shown) can process the brightness, tint, and color of the image signal to improve the 3D effect. For example, it is possible to perform signal processing that sharpens the near field and blurs the far field. On the other hand, the functions of such a 3D processor can be merged into the formatter 360 or into the image processing unit 320. This will be described later with reference to FIG. 7 and the like.

Meanwhile, the audio processing unit (not shown) in the control unit 170 can perform the audio processing of the demultiplexed audio signal. To this end, the audio processing unit (not shown) may include various decoders.

In addition, the audio processing unit (not shown) in the control unit 170 can process bass, treble, volume control, and the like.

The data processing unit (not shown) in the control unit 170 can perform data processing of the demultiplexed data signal. For example, if the demultiplexed data signal is a coded data signal, it can be decoded. The encoded data signal may be EPG (Electronic Program Guide) information including broadcast information such as a start time and an end time of a broadcast program broadcasted on each channel.

FIG. 4 shows that the signals from the OSD generating unit 340 and the image processing unit 320 are mixed in the mixer 345 and then 3D-processed in the formatter 360; however, the mixer may instead be located after the formatter. That is, the output of the image processing unit 320 may be 3D-processed by the formatter 360, the OSD generating unit 340 may perform 3D processing together with OSD generation, and the processed 3D signals may then be mixed by the mixer 345.

Meanwhile, the block diagram of the controller 170 shown in FIG. 4 is a block diagram for an embodiment of the present invention. Each component of the block diagram can be integrated, added, or omitted according to the specifications of the control unit 170 actually implemented.

In particular, the frame rate converter 350 and the formatter 360 are not provided in the controller 170, but may be separately provided.

FIG. 7 is a diagram illustrating various scaling methods of 3D video signals according to an embodiment of the present invention.

Referring to the drawing, in order to increase the 3-dimensional effect, the controller 170 may perform 3D effect signal processing. Among them, in particular, the size or tilt of the 3D object in the 3D image may be adjusted.

As shown in FIG. 7A, the 3D image signal, or a 3D object 510 in the 3D image signal, may be enlarged or reduced as a whole at a predetermined ratio (512); as shown in FIGS. 7B and 7C, the 3D object may be partially enlarged or reduced (trapezoidal shapes 514 and 516). In addition, as shown in FIG. 7D, at least a part of the 3D object may be rotated (parallelogram shape 518). Such scaling or tilt adjustment can emphasize the three-dimensional effect, that is, the 3D effect, of the 3D image or of a 3D object in the 3D image.

On the other hand, as the tilt becomes larger, the length difference between the parallel sides of the trapezoidal shapes 514 and 516 of FIG. 7B or 7C increases, or the rotation angle of the parallelogram shape 518 of FIG. 7D increases.
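The relation between the tilt and the growing side-length difference can be sketched numerically. This is an illustrative model only; the row-wise widening rule and the tilt factor are assumptions, not the patent's method.

```python
# Hypothetical sketch: the trapezoidal tilt of FIGS. 7B-7C modeled as a
# per-row horizontal scale. The top row keeps its width and lower rows are
# widened, so the difference between the parallel sides grows with tilt.

def trapezoid_widths(base_width, height, tilt):
    """Width of each row of an object after tilt adjustment."""
    return [round(base_width * (1 + tilt * row / (height - 1)))
            for row in range(height)]

widths = trapezoid_widths(10, 5, 0.4)
assert widths[0] == 10               # near (top) side unchanged
assert widths[-1] == 14              # far (bottom) side widened
assert widths[-1] - widths[0] == 4   # side-length difference grows with tilt
```

With tilt 0 the object keeps its rectangular shape; increasing the tilt factor increases the length difference between the parallel sides, as described above.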

The size adjustment or the tilt adjustment may be performed after the 3D image signal is arranged in a predetermined format by the formatter 360, or may be performed by the scaler 335 in the image processing unit 320. Meanwhile, the OSD generating unit 340 may also create objects in the shapes shown in FIG. 7 when generating the OSD, to emphasize the 3D effect.

On the other hand, although not shown in the figure, as signal processing for the three-dimensional effect, it is also possible to perform signal processing such as brightness, tint, and color adjustment in addition to the size adjustment or tilt adjustment illustrated in FIG. 7. For example, it is possible to perform signal processing that sharpens the near field and blurs the far field. The signal processing for the 3D effect may be performed in the controller 170 or through a separate 3D processor. In particular, when it is performed in the control unit 170, it may be performed in the formatter 360 or in the image processing unit 320, together with the above-described size adjustment or tilt adjustment.

FIG. 8 is a diagram illustrating image formation by a left eye image and a right eye image, and FIG. 9 is a diagram illustrating a depth of a 3D image according to a gap between a left eye image and a right eye image.

First, referring to FIG. 8, a plurality of images or a plurality of objects 615, 625, 635, and 645 are illustrated.

First, the first object 615 includes a first left eye image 611 (L) based on a first left eye image signal and a first right eye image 613 (R) based on a first right eye image signal, and the interval between the first left eye image 611 (L) and the first right eye image 613 (R) on the display 180 is d1. At this time, the user recognizes that an image is formed at the intersection of an extension line connecting the left eye 601 and the first left eye image 611 and an extension line connecting the right eye 603 and the first right eye image 613. Accordingly, the user perceives the first object 615 as positioned behind the display 180.

Next, since the second object 625 includes the second left eye image 621 (L) and the second right eye image 623 (R), which overlap each other when displayed on the display 180, its disparity is 0. Accordingly, the user perceives the second object 625 as located on the display 180.

Next, the third object 635 and the fourth object 645 include the third left eye image 631 (L) and the third right eye image 633 (R), and the fourth left eye image 641 (L) and the fourth right eye image 643 (R), respectively, and their intervals are d3 and d4.

According to the above-described method, the user recognizes the third object 635 and the fourth object 645 at the respective positions where the images are formed, that is, as located in front of the display 180 in the drawing.

At this time, the fourth object 645 is perceived as positioned in front of the third object 635, that is, as protruding more than the third object 635. This is because the interval d4 between the fourth left eye image 641 (L) and the fourth right eye image 643 (R) is larger than the interval d3 between the third left eye image 631 (L) and the third right eye image 633 (R).

Meanwhile, in the embodiment of the present invention, the distance between the display 180 and each of the objects 615, 625, 635, and 645 as perceived by the user is expressed as a depth. The depth of an object perceived as positioned behind the display 180 is given a negative value (-), and the depth of an object perceived as positioned in front of the display 180 is given a positive value (+). That is, the greater the degree of protrusion toward the user, the greater the depth.

Referring to FIG. 9, since the distance a between the left eye image 701 and the right eye image 702 of FIG. 9A is smaller than the distance b between the left eye image 701 and the right eye image 702 of FIG. 9B, the depth a' of the 3D object of FIG. 9A is smaller than the depth b' of the 3D object of FIG. 9B.

In this way, when the 3D image is exemplified as the left eye image and the right eye image, the positions recognized as images are different depending on the interval between the left eye image and the right eye image. Accordingly, by adjusting the display intervals of the left eye image and the right eye image, the depth of the 3D image or the 3D object composed of the left eye image and the right eye image can be adjusted.
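The geometry of FIGS. 8 and 9 can be sketched numerically: the object is perceived where the two lines of sight through the on-screen images intersect. This is an illustrative similar-triangles model; the eye separation and viewing distance values are assumptions.

```python
# Hypothetical sketch of FIGS. 8-9: perceived distance of a 3D object from
# the disparity between its left eye and right eye images. Eyes sit at
# x = -e/2 and x = +e/2 and look at images on a screen at distance D.

def perceived_distance(e, D, x_left, x_right):
    """Distance to the intersection of the sight lines through the
    left eye image (x_left) and the right eye image (x_right)."""
    d = x_right - x_left              # on-screen disparity
    return D * e / (e - d)            # from similar triangles

e, D = 6.5, 200.0                     # cm, illustrative values
# zero disparity: perceived on the screen (like the second object 625)
assert perceived_distance(e, D, 0.0, 0.0) == D
# uncrossed disparity (d > 0): perceived behind the screen (object 615)
assert perceived_distance(e, D, -1.0, 1.0) > D
# crossed disparity (d < 0): perceived in front; larger |d| protrudes more
near3 = perceived_distance(e, D, 1.0, -1.0)     # like the third object 635
near4 = perceived_distance(e, D, 2.0, -2.0)     # like the fourth object 645
assert near4 < near3 < D
```

Adjusting the display interval between the two images thus directly adjusts the perceived depth, as stated above.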

FIG. 10 is a diagram illustrating a control method of the remote controller of FIG. 2.

As illustrated in FIG. 10A, a pointer 205 corresponding to the remote controller 200 is displayed on the display 180.

The user can move or rotate the remote control device 200 up and down, left and right (FIG. 10 (b)), and front and rear (FIG. 10 (c)). The pointer 205 displayed on the display 180 of the video display device corresponds to the movement of the remote control device 200. As shown in the figure, the remote controller 200 can be referred to as a space remote controller or a 3D pointing device because the pointer 205 is moved and displayed according to its movement in 3D space.

FIG. 10B illustrates that when the user moves the remote control apparatus 200 to the left side, the pointer 205 displayed on the display 180 of the image display apparatus also moves to the left side correspondingly.

Information on the motion of the remote control device 200, sensed through a sensor of the remote control device 200, is transmitted to the image display device. The image display apparatus can calculate the coordinates of the pointer 205 from the information on the motion of the remote control apparatus 200, and can display the pointer 205 so as to correspond to the calculated coordinates.

FIG. 10C illustrates a case in which the user moves the remote control apparatus 200 away from the display 180 while pressing a specific button on the remote control apparatus 200. Thereby, the selected area in the display 180 corresponding to the pointer 205 can be zoomed in and displayed enlarged. Conversely, when the user moves the remote control device 200 closer to the display 180, the selected area in the display 180 corresponding to the pointer 205 may be zoomed out and displayed reduced. Alternatively, the mapping may be reversed: when the remote control device 200 moves away from the display 180, the selected area may be zoomed out, and when the remote control device 200 approaches the display 180, the selected area may be zoomed in.

On the other hand, when the specific button on the remote control device 200 is pressed, recognition of up, down, left, and right movement may be excluded. That is, when the remote control device 200 moves away from or toward the display 180, the up, down, left, and right movements are not recognized, and only the forward and backward movements are recognized. When the specific button on the remote control device 200 is not pressed, only the pointer 205 moves in accordance with the up, down, left, and right movement of the remote control device 200.

On the other hand, the moving speed and moving direction of the pointer 205 may correspond to the moving speed and moving direction of the remote control device 200.

FIG. 11 is an internal block diagram of the remote control device of FIG. 2.

The remote control device 200 includes a wireless communication unit 825, a user input unit 835, a sensor unit 840, an output unit 850, a power supply unit 860, a storage unit 870, and a control unit 880.

The wireless communication unit 825 transmits / receives a signal to / from any one of the video display devices according to the above-described embodiments of the present invention. Of the video display devices according to the embodiments of the present invention, one video display device 100 will be described as an example.

In this embodiment, the remote control apparatus 200 may include an RF module 821 capable of transmitting and receiving signals with the image display apparatus 100 according to the RF communication standard. Also, the remote control apparatus 200 may include an IR module 823 capable of transmitting and receiving signals to and from the image display apparatus 100 according to the IR communication standard.

In this embodiment, the remote control device 200 transmits a signal containing information on the motion of the remote control device 200 to the image display device 100 through the RF module 821.

In addition, the remote control apparatus 200 may receive a signal transmitted from the image display apparatus 100 through the RF module 821. In addition, the remote control apparatus 200 may transmit a command regarding power on / off, channel change, volume change, etc. to the image display apparatus 100 through the IR module 823 as necessary.

The user input unit 835 may include a keypad, a button, a touch pad, or a touch screen. The user can input a command related to the image display apparatus 100 to the remote control apparatus 200 by operating the user input unit 835. When the user input unit 835 has a hard key button, the user can input a command related to the image display device 100 to the remote control device 200 by pushing the hard key button. When the user input unit 835 includes a touch screen, the user may touch a soft key of the touch screen to input a command related to the image display apparatus 100 to the remote control apparatus 200. The user input unit 835 may also include various other types of input means that can be operated by the user, such as a scroll key and a jog key; these examples do not limit the scope of the present invention.

The sensor unit 840 may include a gyro sensor 841 or an acceleration sensor 843. The gyro sensor 841 can sense information about the motion of the remote control device 200.

For example, the gyro sensor 841 can sense information about the operation of the remote control device 200 based on the x, y, and z axes. The acceleration sensor 843 can sense information on the moving speed of the remote control device 200 and the like. On the other hand, a distance measuring sensor can be further provided, whereby the distance to the display 180 can be sensed.

The output unit 850 may output an image or voice signal corresponding to the operation of the user input unit 835 or corresponding to a signal transmitted from the image display apparatus 100. Through the output unit 850, the user can recognize whether the user input unit 835 has been operated or whether the image display apparatus 100 has been controlled.

For example, the output unit 850 may include an LED module 851 that is turned on when the user input unit 835 is operated or a signal is transmitted to or received from the video display device 100 through the wireless communication unit 825, a vibration module 853 for generating vibration, an acoustic output module 855 for outputting sound, or a display module 857 for outputting an image.

The power supply unit 860 supplies power to the remote control device 200. The power supply unit 860 may reduce power waste by stopping the power supply when the remote controller 200 does not move for a predetermined time. The power supply unit 860 may resume power supply when a predetermined key provided in the remote control device 200 is operated.
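The idle cut-off and key-press resume behavior of the power supply unit 860 can be sketched as simple state logic. This is an illustrative model; the timeout value and class structure are assumptions, not the patent's design.

```python
# Hypothetical sketch of the power supply unit's idle cut-off: power is
# stopped when the remote has not moved for a predetermined time and is
# resumed when a predetermined key is operated.

IDLE_TIMEOUT = 5.0  # seconds without motion before cutting power (assumed)

class PowerSupply:
    def __init__(self):
        self.powered = True
        self.last_motion = 0.0

    def on_motion(self, t):
        self.last_motion = t

    def on_key(self, t):
        self.powered = True       # a key operation resumes the supply
        self.last_motion = t

    def tick(self, t):
        if self.powered and t - self.last_motion >= IDLE_TIMEOUT:
            self.powered = False  # stop supply to reduce power waste

ps = PowerSupply()
ps.on_motion(0.0)
ps.tick(4.0)
assert ps.powered             # still within the idle window
ps.tick(6.0)
assert not ps.powered         # idle too long: power supply stopped
ps.on_key(7.0)
assert ps.powered             # key press resumes the power supply
```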

The storage unit 870 may store various types of programs, application data, and the like required for controlling or operating the remote control apparatus 200. When the remote control device 200 wirelessly transmits and receives signals to and from the image display device 100 through the RF module 821, the signals are exchanged through a predetermined frequency band. The control unit 880 of the remote control device 200 may store, in the storage unit 870, information on the frequency band in which signals can be wirelessly exchanged with the image display device 100 paired with the remote control device 200, and may refer to this information.

The controller 880 controls overall operations related to the control of the remote controller 200. The control unit 880 may transmit a signal corresponding to a predetermined key operation of the user input unit 835, or a signal corresponding to the motion of the remote control device 200 sensed by the sensor unit 840, to the image display device 100 through the wireless communication unit 825.

FIG. 12 is a flowchart illustrating a method of operating an image display apparatus according to an exemplary embodiment of the present invention, and FIGS. 13 to 18C are views for explaining various examples of the method of operating the image display apparatus of FIG. 12.

First, the image display apparatus 100 enters a 3D content generation mode (S1205). The controller 170 controls to enter the 3D content generation mode when there is a command to enter the 3D content generation mode based on the input of the remote control apparatus 200 or the local key (not shown).

For example, when a confirmation key is input while a predetermined image is displayed on the display 180, the controller 170 may control to enter the 3D content generation mode immediately. Alternatively, when an object for entering the 3D content generation mode is displayed and that object is selected by the confirmation key input, the controller 170 may control to enter the 3D content generation mode.

Alternatively, when the 3D content generation application installed in the image display apparatus 100 is selected, the controller 170 may drive the 3D content generation application to directly enter the 3D content generation mode.

Next, a template for generating 3D content is displayed on the display 180 (S1210). When entering the 3D content generation mode, the controller 170 may control to display a template for generating 3D content.

Here, the template may be a template previously stored in the image display apparatus 100. Alternatively, the template may be displayed through the 3D content generation application; when the 3D content generation application is driven, the template may be one previously stored in the image display apparatus 100 or one received from an external server through the network interface unit 135.

Meanwhile, referring to FIG. 13A, the template 1300 may include a template for a left eye image 1310 and a template for a right eye image 1320. The left eye image template 1310 and the right eye image template 1320 respectively include the left eye objects 1330a, 1335a, 1340a, 1345a and the right eye objects 1330b, 1335b, 1340b, and 1345b, into which images can be inserted. ) May be provided.

The left eye objects 1330a, 1335a, 1340a, and 1345a and the right eye objects 1330b, 1335b, 1340b, and 1345b may have different positions of the corresponding objects when displaying a 3D image. That is, in displaying a 3D image, the left eye image template 1310 and the right eye image template 1320 have different parallaxes.

The template of FIG. 13A represents a template for producing 3D content and is divided into a left eye image template 1310 and a right eye image template 1320; however, before the 3D image display step, there is no parallax between the left eye objects (1330a, 1335a, 1340a, 1345a) and the right eye objects (1330b, 1335b, 1340b, 1345b), and in the 3D image display step, the respective parallaxes may be applied.

FIG. 13A illustrates a case in which the left eye image template 1310 and the right eye image template 1320 are in a top/down format.

Meanwhile, referring to FIG. 13B, the left eye objects 1330a, 1335a, 1340a, and 1345a and the right eye objects 1330b, 1335b, 1340b, and 1345b are respectively provided with position information (x, y coordinate information) of the corresponding object ( 1331a, 1336a, 1341a, 1346a, 1331b, 1336b, 1341b, and 1346b.

Referring to FIG. 13B, it can be seen that the x coordinates of the left eye objects 1330a, 1335a, 1340a, and 1345a and of the right eye objects 1330b, 1335b, 1340b, and 1345b partially differ for 3D display, while the y coordinates maintain a constant difference corresponding to the top/down format.
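The coordinate relationship of FIG. 13B can be sketched by deriving the right eye objects from the left eye objects: the x coordinates shift by a per-object disparity while the y coordinates keep the constant top/down offset. The object names, the offset value, and the disparity values are illustrative assumptions.

```python
# Hypothetical sketch of FIG. 13B: generating right eye object coordinates
# of a top/down template from the left eye object coordinates.

TEMPLATE_HEIGHT = 540  # y offset between the halves of a top/down frame

def right_eye_objects(left_objects):
    """left_objects: {name: (x, y, disparity)} -> right eye {name: (x, y)}."""
    return {name: (x + disp, y + TEMPLATE_HEIGHT)
            for name, (x, y, disp) in left_objects.items()}

left = {"1330a": (100, 80, 8), "1335a": (400, 80, 0)}
right = right_eye_objects(left)
assert right["1330a"] == (108, 620)   # x shifted by the object's disparity
assert right["1335a"] == (400, 620)   # zero disparity: same x coordinate
# the y difference is constant across objects, matching the top/down format
assert all(right[n][1] - left[n][1] == TEMPLATE_HEIGHT for n in left)
```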

Meanwhile, unlike FIG. 13A, the template 1300 may be configured in various other formats such as a side-by-side format, a frame sequential format, an interlaced format (FIG. 5D), and a checker box format.

On the other hand, unlike FIG. 13A, a single template may be used without distinguishing between a template for a left eye image and a template for a right eye image.

Next, it may be determined whether an object capable of having an image inserted is selected in the template (S1215); if so, an image list for image insertion may be displayed corresponding to the selected object (S1220). When any one image in the image list is selected (S1225), the selected image may be inserted into the selected object and the inserted image may be displayed (S1230).

When one of the objects capable of having an image inserted in the template is selected by a user input, the controller 170 may control to display, for image insertion, an image list including images stored in the image display apparatus 100 or in an external device connected to the image display apparatus 100.

As shown in FIG. 14A, when the first object 1330a capable of having an image inserted is selected by a user input such as the remote control device 200 or a local key while the template 1300 is displayed, the image list 1400 may be displayed as shown in FIG. 14B.

FIG. 14B illustrates that the image list 1400 is displayed near the first object 1330a in the shape of a speech balloon.

Meanwhile, the selected first object 1330a may be displayed distinguishably from the other objects, for example by a highlight.

As shown in FIG. 14B, when the first image 1410 in the image list is selected by a user input such as the remote control device 200 or a local key while the image list 1400 is displayed, an image 1420a corresponding to the first image 1410 may be inserted and displayed at the position of the first object 1330a in the template 1300, as shown in FIG. 14C.

In this case, since the template 1300 includes the left eye image template 1310 and the right eye image template 1320, an image 1420b corresponding to the first image 1410 may also be inserted and displayed in the object 1330b of the right eye image template 1320 corresponding to the first object 1330a of the left eye image template 1310.

Meanwhile, as shown in FIG. 14D, in a state where the image 1420a corresponding to the first image 1410 is inserted at the position of the first object 1330a in the template 1300, an object 1430 for moving the coordinates of the image 1420a according to a user input may be further displayed.

In the drawing, although the coordinate moving object 1430 illustrates scroll bars 1432, 1434, and 1436 for moving the x, y, and z coordinates of the image 1420a, various modifications are possible.

For example, when the x coordinate is moved to the right using the first scroll bar 1432, the image 1420a may be moved to the right and displayed as shown in FIG. 14E. Accordingly, an image 1420a corresponding to the first image 1410 may be displayed by being spaced apart from the position of the first object 1330a by the right movement amount.

Meanwhile, according to the x-coordinate right-shift input, the image 1420b corresponding to the first image 1410 may also be displayed spaced apart by the right movement amount from the position of the object 1330b in the right eye image template 1320.

Meanwhile, the y-coordinate movement and the z-coordinate movement may be performed similarly. In particular, the z-coordinate movement adjusts the disparity between the left eye image and the right eye image. For example, when the z coordinate of the first object 1330a is increased, the parallax between the first object 1330a and the corresponding object 1330b may increase. Accordingly, the depth may be increased when the 3D image is later displayed.
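The mapping from the z-coordinate control to parallax can be sketched as follows. The linear gain and the function name are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch: the z-coordinate scroll bar 1436 mapped to the
# disparity between an inserted image's left eye copy and right eye copy.
# A larger z gives a larger parallax and hence a larger depth at display time.

DISPARITY_PER_Z = 2  # pixels of extra parallax per z step (assumed)

def apply_z(left_x, z):
    """Return (left_x, right_x) for an object at depth control value z."""
    disparity = z * DISPARITY_PER_Z
    return left_x, left_x + disparity

assert apply_z(300, 0) == (300, 300)   # z = 0: no parallax, on-screen depth
assert apply_z(300, 5) == (300, 310)   # larger z: larger parallax and depth
```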

On the other hand, as described above, when image insertion is completed for the other insertable objects (1335a, 1340a, 1345a, 1335b, 1340b, and 1345b) in the template, the 3D content is simply complete. Therefore, the user can easily generate 3D content.

FIG. 15A illustrates the insertion of images 1430a, 1515a, 1525a, and 1535a and images 1430b, 1515b, 1525b, and 1535b into the respective insertable objects of the left eye image template 1510 and the right eye image template 1520 of the template 1500.

Next, when the 3D content generation mode end input is received, the 3D content generation mode is terminated (S1235). The controller 170 may control to end the 3D content generation mode when the 3D content generation mode end input is received.

For example, as shown in FIG. 15A, when an exit button of the remote controller or a local key is operated, or an object indicating exit displayed on the display 180 is selected, in a state in which 3D content generation using the template is completed, the display of the corresponding template may be terminated. That is, 3D content generation can be completed.

Next, when there is a 3D content view input (S1240), 3D content including the inserted image is displayed as a 3D image (S1245).

When there is a 3D content view input, or when a 3D content view object displayed on the display is selected by the remote control device or a local key, the controller 170 may control to list and display the 3D content stored in the image display device 100 or in an external device connected to the image display device 100.

When a specific 3D content is selected while the 3D content list is displayed, the controller 170 controls to display the 3D content as a 3D image.

For example, for 3D display in a passive manner, that is, an FPR (Film-type Patterned Retarder) method, the controller 170 may generate a parallax between the left eye image and the right eye image of the 3D content, and accordingly control the left eye image and the right eye image of the 3D content to be displayed at the same time.

That is, as described above, in the template of FIG. 13A, the left eye image template 1310 and the right eye image template 1320 are simultaneously displayed before the 3D image display step, and in the 3D image display step, the left eye image template 1310 and the right eye image template 1320, in which a corresponding parallax is given for each object, may be simultaneously displayed. Accordingly, a user wearing polarized glasses recognizes the 3D content as a 3D image.
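
The passive (FPR) display described above can be sketched as a row-interleaved composition of the two eye images. The following is a minimal illustration in Python/NumPy; the function name and the toy 4×4 images are assumptions for demonstration, not from the document:

```python
import numpy as np

def fpr_compose(left, right):
    """Passive (FPR) output: even display rows carry the left eye image
    and odd rows the right eye image, shown simultaneously; the patterned
    retarder polarizes the two row sets differently for polarized glasses."""
    assert left.shape == right.shape
    frame = np.empty_like(left)
    frame[0::2] = left[0::2]   # even rows from the left eye image
    frame[1::2] = right[1::2]  # odd rows from the right eye image
    return frame

left = np.zeros((4, 4), dtype=np.uint8)       # all-black left eye image
right = np.full((4, 4), 255, dtype=np.uint8)  # all-white right eye image
frame = fpr_compose(left, right)
```

Both eye images are present in every output frame, which is why the document describes the left and right eye images as displayed "at the same time" in the passive method.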

Alternatively, for 3D display in an active manner, that is, the shutter glass (SG) method, the controller 170 may generate a parallax between the left eye image and the right eye image of the 3D content and, accordingly, control the left eye image and the right eye image of the 3D content to be displayed sequentially.

Accordingly, the user can easily watch the desired content as a 3D image.
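
The active (shutter-glass) method, by contrast, time-multiplexes the two eye images. A minimal sketch, with illustrative frame labels of my own choosing:

```python
def shutter_sequence(left_frames, right_frames):
    """Active (shutter-glass) output: left and right eye frames are
    time-multiplexed, so the panel runs at twice the source frame rate
    and the glasses open the matching eye shutter for each frame."""
    sequence = []
    for l, r in zip(left_frames, right_frames):
        sequence.append(("L", l))  # left shutter open
        sequence.append(("R", r))  # right shutter open
    return sequence

seq = shutter_sequence(["L0", "L1"], ["R0", "R1"])
```

Here each source frame pair produces two output frames, matching the sequential display the document attributes to the SG method.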

As shown in FIG. 15A, when 3D content generation is completed and there is then a 3D content view input, for example, when a 3D content view object displayed on the display is selected by the remote controller or a local key, the 3D image 1500 is displayed as shown in FIG. 15B. FIG. 15B illustrates 3D objects 1520, 1515, 1525, and 1535, generated by image insertion, with a parallax between the left eye image and the right eye image.

The 3D objects 1520, 1515, 1525, and 1535 may correspond to the objects 1420a and 1420b, 1515a and 1515b, 1525a and 1525b, and 1535a and 1535b of FIG. 15A.

Meanwhile, when there are a plurality of 3D objects in the 3D image, the 3D objects may be sequentially highlighted one by one, or may be displayed as 3D objects whose depths are set sequentially for each 3D object. That is, they may be displayed sequentially as a 3D slide.

FIG. 16A illustrates that the 3D object 1520 among the plurality of 3D objects is highlighted and displayed at a first time t1, FIG. 16B illustrates that the 3D object 1515 is highlighted and displayed at a second time t2, FIG. 16C illustrates that the 3D object 1525 is highlighted and displayed at a third time t3, and FIG. 16D illustrates that the 3D object 1535 is highlighted and displayed at a fourth time t4.
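
The t1–t4 schedule above can be sketched as a per-step depth assignment: at each time slot one object is emphasized with a larger (protruding) depth while the rest stay at a base depth. The depth values and function name are illustrative assumptions, not from the document:

```python
def slide_depths(objects, step, base=0, highlight=10):
    """At each time step exactly one 3D object is emphasized: it gets
    the larger (protruding) depth while the others keep the base depth,
    producing the sequential 3D-slide effect of FIGS. 16A-16D."""
    focus = step % len(objects)  # wrap around so the slide can loop
    return {obj: (highlight if i == focus else base)
            for i, obj in enumerate(objects)}

objects = [1520, 1515, 1525, 1535]  # reference numerals from FIG. 16
t2 = slide_depths(objects, 1)       # second time slot highlights 1515
```

Cycling `step` through 0–3 reproduces the t1–t4 sequence of the four figures.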

The template for producing 3D content may be provided in various ways in the image display apparatus.

FIG. 17A illustrates a template 1700 including an image-insertable left eye image template 1710 and a right eye image template 1720, and FIG. 17B illustrates a template 1750 including an image-insertable left eye image template 1760 and a right eye image template 1770.

After entering the above-described 3D content generation mode (S1205), in the template display step S1210, instead of the template 1300 of FIG. 13A, the template of FIG. 17A or the template of FIG. 17B may be displayed.

Meanwhile, after entering the 3D content generation mode (S1205) and before displaying the template, a template list, pre-stored in the image display apparatus 100 or provided through an application for generating 3D content, may be displayed. Then, the template selected from the list may be displayed on the display 180 for 3D content generation.

18A illustrates a template list 1800 that includes first to third templates 1810, 1815, and 1820. Among these, when the first template 1810 is selected, the template 1300 is displayed as shown in FIG. 13A, and when the second template 1815 is selected, the second template 1830 is displayed as shown in FIG. 18B. When the third template 1820 is selected, the third template 1840 may be displayed as shown in FIG. 18C. As a result, 3D content can be produced using a template desired by a user.
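
A template of the kind described above — image-insertable objects together with their position information, duplicated for the left and right eye — could be modeled as follows. This is a sketch under my own assumptions; the class and function names are illustrative, not from the document:

```python
from dataclasses import dataclass

@dataclass
class Slot:
    """One image-insertable object: its position in the template and a
    per-object depth used to derive the left/right-eye parallax."""
    x: int
    y: int
    depth: int = 0
    image: object = None

def make_eye_pair(slots):
    """Duplicate the slot layout for both eyes; the right-eye copy is
    shifted left by each slot's depth, giving a protruding parallax."""
    left = [Slot(s.x, s.y, s.depth) for s in slots]
    right = [Slot(s.x - s.depth, s.y, s.depth) for s in slots]
    return left, right

left, right = make_eye_pair([Slot(100, 50, depth=8), Slot(40, 20)])
```

Inserting a user-selected image would then mean filling the `image` field of the matching left and right slots, as in the insertion step S1220.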

19 is a flowchart illustrating a method of operating an image display device according to another exemplary embodiment. FIGS. 20A to 22B are diagrams for describing various examples of the method of operating the image display device of FIG. 19.

Referring to the drawing, 3D content is selected (S1905), and the selected 3D content is displayed (S1910).

When predetermined 3D content is selected by user input in a state where a 3D content list, including 3D content stored in the image display apparatus 100 or in an external device connected to the image display apparatus 100, is displayed on the display 180 of the image display apparatus 100, the controller 170 may control to display the 3D content as a 3D image.

20A illustrates that the 3D content list 2010 is displayed on the display 180. The 3D content may be 3D content generated by FIG. 12 or the like, or 3D content previously stored in the image display apparatus 100.

In particular, FIG. 20A illustrates a 3D content list 2010 that includes 3D content 2012, 2014, and 2016 generated through a template.

When the 3D content 2014 is selected from the 3D content list 2010 of FIG. 20A, the corresponding 3D content 2014 may be displayed as the 3D image 2020, as shown in FIG. 20B. In the drawing, for convenience, the figure is shown in 2D form, but it is assumed that it is displayed as a 3D image. That is, it is assumed that the user can recognize that the inserted images and the like in the 3D content are displayed so as to protrude.

Meanwhile, a menu related to the 3D image may be displayed on the display 180 together with the 3D image 2020. For example, a transmission object 2022 for transmitting the 3D image to an external device, a search object (2024 in FIG. 20C) for searching the 3D image, a conversion object (2026 in FIG. 20C) for converting a 2D image into a 3D image, a refresh object, an exit object, and the like may be displayed as shown in FIG. 20B. Such objects may be displayed so as to protrude.

Next, the controller 170 determines whether the transmission object displayed on the display 180 is selected (S1915); if so, a transmission target list including transmittable external devices is displayed (S1920). When any one external device in the transmission target list is selected (S1925), the controller may control to transmit the displayed 3D content, or an image in the displayed 3D content, to the corresponding external device (S1930).

As illustrated in FIG. 20B, while a menu related to the 3D image is displayed on the display 180 together with the 3D image 2020, if the transmission object 2022 for transmitting the 3D image to an external device is selected from the menu, a transmission target list 2030 including transmittable external devices may be displayed, as shown in FIG. 20C.

In FIG. 20C, the transmission target list 2030 is illustrated as being displayed so as to protrude as a 3D object. Referring to FIG. 20C, the transmission target list 2030 may include an external device item 2036 and a transmission item 2034, and may further include an object 2032 representing the 3D image, or an image in the 3D image, to be transmitted to the external device. The object 2032 representing the 3D image to be transmitted may be included in the transmission item 2034 as shown in the figure.

When a specific external device item 2036 is selected from the transmission target list 2030, the controller 170 may control to transmit the 3D image displayed on the display 180, or an image such as a 3D object in the 3D image, to the corresponding external device.

Alternatively, when the transmission item 2034 is selected while a specific external device item 2036 is selected or focused in the transmission target list 2030, the controller 170 may control to transmit, to the external device, the 3D image displayed on the display 180 or an image such as a 3D object in the 3D image.

20E illustrates that a 3D image or the like is transmitted to the mobile device 2080, such as a cellular phone, via the network 2050. As a result, the received 3D image can easily be viewed on the mobile device 2080 as well. Meanwhile, since the mobile device 2080 is a personalized device, it may display the 3D image in a glasses-free method rather than a glasses method.

Meanwhile, in addition to the mobile device 2080, the 3D image may be transmitted to an image display device such as the TV 2060 or the notebook computer 2070, according to the selection in the transmission target list 2030.

Meanwhile, the transmitted image may be 3D content created through a template, and may be a video instead of a still image. In particular, as described above, the image may be a 3D slide image.

Meanwhile, when transmitting the 3D image of FIG. 20C, after the external device to transmit to is selected, a list 2040 including detailed setting items of the 3D image may be separately displayed on the display 180.

For example, when the image display apparatus 100 receives, and thus knows, information about the external device that is to receive the 3D content, it is possible to change the settings of the 3D image to be transmitted before transmission, so as to facilitate 3D playback on that external device.

20D illustrates that a list 2040 related to the settings of the 3D image is displayed on the display 180. The list 2040 may include color information of the left eye image and the right eye image, format information of the 3D image, presence or absence of glasses, audio information related to the 3D image, and the like.

In the figure, an example is illustrated in which red and blue are selected as the color information of the left eye image and the right eye image, respectively, in the list 2040. Accordingly, the controller 170 may change the settings and control the transmission so that the left eye image of the 3D image is viewed mainly in red and the right eye image mainly in blue.
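
A red/blue color setting like this corresponds to anaglyph composition: one color channel is taken from each eye image so that red/blue filter glasses separate the two views. A minimal sketch in Python/NumPy, with toy solid-color images as assumptions:

```python
import numpy as np

def red_blue_anaglyph(left_rgb, right_rgb):
    """Compose a red/blue anaglyph: the red channel comes from the left
    eye image and the blue channel from the right eye image, so red/blue
    filter glasses route each view to the matching eye."""
    out = np.zeros_like(left_rgb)
    out[..., 0] = left_rgb[..., 0]   # R channel <- left eye image
    out[..., 2] = right_rgb[..., 2]  # B channel <- right eye image
    return out

left = np.full((2, 2, 3), [255, 0, 0], dtype=np.uint8)   # pure-red left eye
right = np.full((2, 2, 3), [0, 0, 255], dtype=np.uint8)  # pure-blue right eye
out = red_blue_anaglyph(left, right)
```

This matches the document's setting in which the left eye image is viewed mainly in red and the right eye image mainly in blue.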

Meanwhile, when the transmitted image is 3D content created through a template, the entire 3D content may be transmitted, or each image in the 3D content may be transmitted separately.

21A to 21C illustrate that the images in 3D content are each transmitted separately to an external device.

As shown in FIG. 21A, when certain 3D content 2014 is selected while the 3D content list 2010 is displayed on the display 180, the 3D image 2020 of the selected 3D content may be displayed as shown in FIG. 21B.

In this case, when the transmission object 2022 for transmitting the 3D image to an external device is selected, the transmission target list 2030 including transmittable external devices may be displayed, as illustrated in FIG. 21C.

In this case, when any one transmission target item 2036 in the transmission target list 2030 is selected, the entire displayed 3D image 2025 is not transmitted to the mobile device corresponding to the selected transmission target item; instead, each image included in the 3D content may be transmitted.

Since three images are provided in the 3D image 2025 of FIG. 21C, the image display device 100 may transmit the three images 2142, 2144, and 2146, excluding the external background, sequentially or simultaneously to the mobile device 2180.

After receiving the three images 2142, 2144, and 2146, the mobile device 2180 may generate 3D content by executing a template for producing 3D content, and then display the generated 3D image 2190 on its display, as shown in FIG. 21D.

When a common template or the like is provided in different devices, only some of the images and the like can be transmitted instead of the entire 3D content, so data transmission can be performed easily and quickly. Also, the same 3D content can easily be viewed on other devices.
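
The shared-template optimization just described amounts to choosing the payload by whether the receiver already holds the template. A sketch under my own assumptions (the field names and `build_payload` function are illustrative, not from the document):

```python
def build_payload(content, receiver_has_template):
    """If the receiving device already has the common template, send only
    the template id and the inserted images so it can re-render the 3D
    content locally; otherwise fall back to sending the full content."""
    if receiver_has_template:
        return {"template_id": content["template_id"],
                "images": content["images"]}
    return content

content = {"template_id": "t1",
           "images": ["img2142", "img2144", "img2146"],
           "rendered": "<full 3D frame data>"}
small = build_payload(content, receiver_has_template=True)
```

The small payload mirrors FIG. 21C: only the three inserted images (plus a template reference) travel over the network, and the mobile device rebuilds the 3D image itself.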

Meanwhile, FIGS. 22A and 22B illustrate transmitting the 3D content currently being viewed, or 3D content related information, to a user or a device associated with a selected image when a specific image is selected in the displayed 3D content.

As shown in FIG. 22A, when a specific image 2215 is selected while the 3D content 2210 is displayed, a list 2225 related to the user of the image may be displayed. In this case, the list 2225 may include a device item or an external server item used by the corresponding user.

When any one of the items is selected, the image display device 100 may transmit the 3D content currently being watched, or 3D content related information, to the device corresponding to the selected item, specifically to the corresponding mobile device 2180. Accordingly, the mobile device 2180 may display the corresponding information 2290 on its display.

When the information 2290 is selected, the mobile device 2180 may display the 3D content related to the information as a 3D image 2190 on its display, as illustrated in FIG. 22B. As such, by transmitting the 3D content currently being viewed, or 3D content related information, to the external device, the external device can simply view the 3D content or the related information.

FIG. 23 is a flowchart illustrating a method of operating an image display apparatus according to another exemplary embodiment. FIGS. 24A to 25B are views referred to for describing various examples of the method of operating the image display apparatus of FIG. 23.

First, an image is displayed (S2305). If there is a menu display input (S2310), a menu including a conversion object for converting the 2D image into a 3D image is displayed (S2315).

The image displayed on the display may be a broadcast image or an external input image. When there is a menu display input by a user input or the like, as shown in FIG. 24A, the broadcast information 2415, the 3D transform object 2420, the application 2425, and the like may be displayed together with the broadcast image 2410.

In FIG. 24A, a transmission object, a search object, a 3D transform object 2420, a refresh object, an exit object, and the like are displayed together in an area where the broadcast information 2415 is displayed.

Meanwhile, in FIG. 24A, as examples of the application 2425, an application item that can be viewed together with the broadcast, a messenger window, and another application viewing item are illustrated.

Meanwhile, when the conversion object is selected (S2320), the image display device 100 determines whether a 3D image corresponding to the 2D image exists in the image display device (S2325). If it does not exist, the 2D image is converted into a 3D image (S2330), and the converted 3D image is displayed (S2335). On the other hand, if the 3D image exists, the 3D image is displayed immediately (S2335) without a separate conversion procedure.

When the conversion object 2420 of FIG. 24A is selected, the controller 170 checks whether a 3D image related to the 2D image exists in the image display apparatus or a connected external device; if not, it may convert the 2D image into a 3D image and control to display it.

In the drawing, the conversion object 2420 is selected using the pointer 205 of the remote control apparatus 200.

That is, as shown in FIG. 24B, the broadcast information 2417, the application 2427, and the like, other than the broadcast image 2430, may be displayed so as to protrude as 3D objects. Of course, conversely, the broadcast image 2430 may be displayed so as to protrude.

Meanwhile, the conversion from a 2D image to a 3D image may be performed by extracting objects by applying an edge detection algorithm or the like to the 2D image, assigning a depth to each extracted object, and, corresponding to the depth, giving a corresponding parallax to each object in the left eye image and the right eye image. The conversion to the 3D image may be performed by the controller 170, and in particular by the formatter 340.
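
The depth-to-parallax step described above can be sketched as a simple mapping: each extracted object's assigned depth yields a horizontal disparity, which is split between the two eye images. The scale factor and function names are my own illustrative assumptions:

```python
def disparity_from_depth(depth, scale=0.1):
    """Map an object's assigned depth to a horizontal disparity in
    pixels; a larger depth yields a larger parallax between the eyes."""
    return int(round(depth * scale))

def eye_positions(x, disparity):
    """Place the object in each eye image: shifting the left-eye copy
    right and the right-eye copy left makes the object protrude."""
    return x + disparity // 2, x - disparity // 2

d = disparity_from_depth(80)   # e.g. an extracted foreground object
lx, rx = eye_positions(100, d)
```

Objects assigned zero depth get zero disparity and appear at the screen plane; the extracted foreground objects protrude according to their depth.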

As described above, the conversion object 2420 displayed on the display 180 may be used to easily convert the 2D image being viewed into a 3D image and to display the converted 3D image.

FIGS. 25A and 25B are similar to FIGS. 24A and 24B, but illustrate that another menu is displayed on the display 180.

By the menu display input, as shown in FIG. 25A, the broadcast video 2510, the broadcast information 2515, and the simple menu 2525 may be displayed.

The simple menu 2525 may include a home menu item, a simple image channel item, a 3D conversion item 2520, a simple setting item, and the like.

In the drawing, the conversion object 2520 is selected using the pointer 205 of the remote control apparatus 200.

When the conversion object 2520 of FIG. 25A is selected, the broadcast image may be converted into a 3D image 2530 including the 3D object 2535 and displayed, as shown in FIG. 25B.

The image display device and the method of operating the same according to the present invention are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the operating method of the image display apparatus of the present invention can be implemented as processor-readable code on a processor-readable recording medium included in the image display apparatus. The processor-readable recording medium includes all kinds of recording devices in which processor-readable data is stored. Examples of the processor-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like, and it may also be implemented in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed over network-connected computer systems so that processor-readable code can be stored and executed in a distributed fashion.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

Claims (20)

Displaying a template for generating 3D content;
Inserting and displaying the selected image in an image-insertable object in the template when an image for image insertion is selected; And
And displaying 3D content including the inserted image as a 3D image when the 3D content view input is received.
The method of claim 1,
Displaying an image list for inserting an image corresponding to the selected object when an object capable of inserting an image is selected in the template;
Inserting and displaying the selected image into the selected object,
And when one image of the image list is selected, inserting the selected image into the selected object and displaying the selected image.
The method of claim 1,
The template,
And a left eye image template and a right eye image template.
The method of claim 3,
The left eye image template and the right eye image template,
And at least one object capable of inserting an image, and position information of the corresponding object.
The method of claim 1,
The 3D image display step,
And when a plurality of images are added to the generated 3D content, displaying at least some of the plurality of images as a 3D slide image.
The method of claim 5,
And, when the 3D slide image is displayed, the plurality of images are sequentially highlighted or sequentially displayed as 3D objects having a corresponding depth.
The method of claim 1,
And if the selected image is inserted into the selected object and displayed, varying the position according to an input for adjusting the position of the image.
The method of claim 1,
Receiving a depth setting input of the inserted image when the selected image is inserted and displayed in the selected object;
The 3D image display step,
And displaying the content as a 3D image in response to a depth setting input of the inserted image.
The method of claim 1,
Displaying a template list including at least one template;
And the template display step comprises displaying the template selected from the template list.
The method of claim 1,
Displaying a transmission object for transmitting the 3D image to an external device;
If the transmission object is selected, displaying a transmission target list including a transmittable external device; And
And transmitting the 3D image or the image in the 3D image to a corresponding external device when a predetermined external device in the transmission target list is selected.
The method of claim 1,
The transmission target list,
And an object representing the 3D image or the image in the 3D image to be transmitted to an external device.
Displaying a transmission object for transmitting a 3D image to an external device;
If the transmission object is selected, displaying a transmission target list including a transmittable external device; And
And transmitting the 3D image or the image in the 3D image to the corresponding external device when a predetermined external device in the transmission target list is selected.
The method of claim 12,
The transmission target list,
And an object representing the 3D image or the image in the 3D image to be transmitted to an external device.
The method of claim 12,
Selecting a 3D image; And
And displaying the selected 3D image,
Wherein the transmission object is displayed together with the 3D image.
The method of claim 12,
Displaying a 2D image;
Displaying a transformation object for converting the 2D image into the 3D image; And
When the transform object is selected, converting the 2D image into a 3D image and displaying the converted 3D image;
And the 3D image to be transmitted, or the 3D image including the image to be transmitted, is the converted 3D image.
Displaying a 2D image;
Displaying a transformation object for converting the 2D image into the 3D image;
Converting the 2D image into a 3D image when the transform object is selected; And
And displaying the converted 3D image.
17. The method of claim 16,
The conversion step,
Is performed when there is no 3D image corresponding to the 2D image in the image display apparatus or in an external device connected to the image display apparatus.
A display displaying a template for generating 3D content; And
And a control unit which controls to insert and display the selected image in an image-insertable object in the template when an image for image insertion is selected.
The control unit,
And displaying 3D content including the inserted image as a 3D image when there is a 3D content view input.
A display displaying a transmission object for transmitting a 3D image to an external device; And
And a control unit which controls to display a transmission target list including a transmittable external device when the transmission object is selected.
The control unit,
And when a predetermined external device in the transmission target list is selected, transmitting the 3D image or the image in the 3D image to the corresponding external device.
A display for displaying a 2D image; And
And a controller configured to display a conversion object for converting the 2D image into the 3D image.
The control unit,
And converting the 2D image into a 3D image when the conversion object is selected.
KR1020120014917A 2012-02-14 2012-02-14 Image display apparatus, and method for operating the same KR20130093364A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120014917A KR20130093364A (en) 2012-02-14 2012-02-14 Image display apparatus, and method for operating the same


Publications (1)

Publication Number Publication Date
KR20130093364A true KR20130093364A (en) 2013-08-22

Family

ID=49217681

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120014917A KR20130093364A (en) 2012-02-14 2012-02-14 Image display apparatus, and method for operating the same

Country Status (1)

Country Link
KR (1) KR20130093364A (en)

Similar Documents

Publication Publication Date Title
EP2547112B1 (en) Image display apparatus and method for operating the same
US9024875B2 (en) Image display apparatus and method for operating the same
KR20120034996A (en) Image display apparatus, and method for operating the same
US20130057541A1 (en) Image display apparatus and method for operating the same
KR101951326B1 (en) Image display apparatus, server and method for operating the same
US20130070063A1 (en) Image display apparatus and method for operating the same
KR101836846B1 (en) Image display apparatus, and method for operating the same
KR101746808B1 (en) Image display apparatus, media apparatus and method for operating the same
KR101929484B1 (en) Image display apparatus, and method for operating the same
KR20130093364A (en) Image display apparatus, and method for operating the same
KR101890323B1 (en) Image display apparatus, settop box and method for operating the same
KR101945811B1 (en) Image display apparatus, and method for operating the same
KR101825669B1 (en) Image display apparatus, and method for operating the same
KR20140089794A (en) Image display apparatus and method for operating the same
KR20130030603A (en) Image display apparatus, and method for operating the same
KR101878806B1 (en) Image display apparatus, and method for operating the same
KR101882214B1 (en) Image display apparatus, server and method for operating the same
KR101946585B1 (en) Image display apparatus, and method for operating the same
KR20140047427A (en) Image display apparatus and method for operating the same
KR102014149B1 (en) Image display apparatus, and method for operating the same
KR20130071149A (en) Image display apparatus, and method for operating the same
KR20130120255A (en) Image display apparatus, and method for operating the same
KR20130016986A (en) Image display apparatus, and method for operating the same
KR20140044181A (en) Image display apparatus and method for operating the same
KR20140062255A (en) Image display apparatus and method for operating the same

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination