{"id":243,"date":"2019-07-25T12:11:28","date_gmt":"2019-07-25T19:11:28","guid":{"rendered":"http:\/\/www.blueflagiris.com\/blueflag_wordpress\/?page_id=243"},"modified":"2019-08-02T11:09:59","modified_gmt":"2019-08-02T18:09:59","slug":"patents","status":"publish","type":"page","link":"https:\/\/www.blueflagiris.com\/blueflag_wordpress\/patents\/","title":{"rendered":"Patents"},"content":{"rendered":"<p>[et_pb_section fb_built=&#8221;1&#8243; admin_label=&#8221;Page Title Section&#8221; _builder_version=&#8221;3.26.3&#8243; background_color_gradient_direction=&#8221;115deg&#8221; background_image=&#8221;https:\/\/otm.uic.edu\/wp-content\/uploads\/sites\/66\/2018\/01\/patent-seal.jpg&#8221; custom_padding=&#8221;150px||0|&#8221;][et_pb_row custom_padding=&#8221;27px|40px||10%&#8221; custom_margin=&#8221;|||0px&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;rgba(99,132,108,0.81)&#8221; background_size=&#8221;initial&#8221; background_position=&#8221;top_left&#8221; background_repeat=&#8221;repeat&#8221; border_radii=&#8221;||6px|6px|&#8221; border_width_right=&#8221;30px&#8221; border_color_right=&#8221;#bdada0&#8243; box_shadow_style=&#8221;preset3&#8243; box_shadow_vertical=&#8221;35px&#8221; box_shadow_blur=&#8221;70px&#8221; box_shadow_spread=&#8221;-35px&#8221; box_shadow_color=&#8221;rgba(0,0,0,0.6)&#8221; max_width=&#8221;960px&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;2%&#8221; use_custom_width=&#8221;on&#8221; custom_width_px=&#8221;960px&#8221;][et_pb_column type=&#8221;4_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding_tablet=&#8221;|||10%&#8221; custom_padding_last_edited=&#8221;off|desktop&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text admin_label=&#8221;Title&#8221; _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; header_font=&#8221;Abhaya Libre|700|||||||&#8221; 
header_font_size=&#8221;70px&#8221; header_line_height=&#8221;1.2em&#8221; header_4_font=&#8221;||||||||&#8221; background_layout=&#8221;dark&#8221; custom_margin=&#8221;|||&#8221; custom_padding=&#8221;|||&#8221; header_font_size_tablet=&#8221;40px&#8221; header_font_size_last_edited=&#8221;on|tablet&#8221;]<\/p>\n<h2>Granted Patents<\/h2>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=&#8221;1&#8243; admin_label=&#8221;Events Section&#8221; _builder_version=&#8221;3.22&#8243; background_image=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/paper.jpg&#8221; custom_padding=&#8221;70px||70px|&#8221;][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US10297034 &#8211; Systems and methods for fusing images<\/h3>\n<h5>Inventor: James Wilson Nash, Kalin Mitkov Atanassov, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">A method performed by an 
electronic device is described. The method includes obtaining a first image from a first camera, the first camera having a first focal length and a first field of view. The method also includes obtaining a second image from a second camera, the second camera having a second focal length and a second field of view disposed within the first field of view. The method further includes aligning at least a portion of the first image and at least a portion of the second image to produce aligned images. The method additionally includes fusing the aligned images based on a diffusion kernel to produce a fused image. The diffusion kernel indicates a threshold level over a gray level range. The method also includes outputting the fused image. The method may be performed for each of a plurality of frames of a video feed.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US10297034-Systems-and-methods-for-fusing-images.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-6q&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: May. 
21, 2019<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9986223 &#8211; Folded optic passive depth sensing system<\/h3>\n<h5>Inventor: Sergiu Radu Goma, Todor Georgiev Georgiev, Biay-Cheng Hseih, Zheng-Wu Li, Wen-Yu Sun<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Certain aspects relate to systems and techniques for folded optic stereoscopic imaging, wherein a number of folded optic paths each direct a different one of a corresponding number of stereoscopic images toward a portion of a single image sensor. 
Each folded optic path can include a set of optics including a first light folding surface positioned to receive light propagating from a scene along a first optical axis and redirect the light along a second optical axis, a second light folding surface positioned to redirect the light from the second optical axis to a third optical axis, and lens elements positioned along at least the first and second optical axes and including a first subset having telescopic optical characteristics and a second subset lengthening the optical path length. The sensor can be a three-dimensionally stacked backside illuminated sensor wafer and reconfigurable instruction cell array processing wafer that performs depth processing.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9986223-Folded-optic-passive-depth-sensing-system.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-6o&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Mar. 
29, 2018<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9883119 &#8211; Method and system for hardware-based motion sensitive HDR image processing<\/h3>\n<h5>Inventor: Kalin Mitkov Atanassov, Sergiu Radu Goma, Stephen Michael Verrall, Albrecht Johannes Lindner<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Systems and methods for performing motion sensitive high dynamic range (HDR) image processing. A saturation analysis circuit is configured to receive a set of image data corresponding to portions of a set of image frames having different exposure times, from a lowest exposure time to a highest exposure time, and select image data from a frame that does not exceed a saturation threshold value. 
A motion detection circuit may be configured to determine whether the image data is associated with movement, by comparing image data from pairs of frames of adjacent exposure times, and changing the selection to a lower exposure time frame if movement is detected. By selecting which exposure time is used based upon movement in the image frame, ghosting and blurring in HDR images containing movement can be reduced.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9883119-Method-and-system-for-hardware-based-motion-sensitive-HDR-image-processing.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-6m&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Jan. 
30, 2018<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9786705 &#8211; Solid state image sensor with extended spectral response<\/h3>\n<h5>Inventor: Sergiu Radu Goma, Biay-Cheng Hseih, Todor Georgiev Georgiev<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Various embodiments are directed to an image sensor that includes a first sensor portion and a second sensor portion coupled to the first sensor portion. The second sensor portion may be positioned relative to the first sensor portion so that the second sensor portion may initially detect light entering the image sensor, and some of that light passes through the second sensor portion and is detected by the first sensor portion. In some embodiments, the second sensor portion may be configured to have a thickness suitable for sensing visible light. 
The first sensor portion may be configured to have a thickness suitable for sensing IR or NIR light. As a result of the arrangement and structure of the second sensor portion and the first sensor portion, the image sensor captures substantially more light from the light source.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9786705-Solid-state-image-sensor-with-extended-spectral-response.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-6i&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Oct. 
10, 2017<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US10194089 &#8211; Systems and methods for implementing seamless zoom function using multiple cameras<\/h3>\n<h5>Inventor: James Wilson Nash, Kalin Mitkov Atanassov, Sergiu Radu Goma, Narayana Karthik Sadanandam Ravirala, Venkata Ravi Kiran Dayana, Karthikeyan Shanmugavadivelu<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Devices and methods for providing seamless preview images for multi-camera devices having two or more asymmetric cameras. A multi-camera device may include two asymmetric cameras disposed to image a target scene. 
The multi-camera device further includes a processor coupled to a memory component and a display, the processor configured to retrieve an image generated by a first camera from the memory component, retrieve an image generated by a second camera from the memory component, receive input corresponding to a preview zoom level, retrieve spatial transform information and photometric transform information from memory, modify at least one image received from the first and second cameras by the spatial transform and the photometric transform, and provide on the display a preview image comprising at least a portion of the at least one modified image and a portion of either the first image or the second image based on the preview zoom level.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US10194089-Systems-and-methods-for-implementing-seamless-zoom-function-using-multiple-cameras.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-6g&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Jan. 
29, 2019<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9681109 &#8211; Systems and methods for configurable demodulation<\/h3>\n<h5>Inventor: Hasib Ahmed Siddiqui, Kalin Mitkov Atanassov, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Exemplary embodiments are directed to configurable demodulation of image data produced by an image sensor. In some aspects, a method includes receiving information indicating a configuration of the image sensor. In some aspects, the information may indicate a configuration of sensor elements and\/or corresponding color filters for the sensor elements. A modulation function may then be generated based on the information. 
In some aspects, the method also includes demodulating the image data based on the generated modulation function to determine chrominance and luminance components of the image data, and generating the second image based on the determined chrominance and luminance components.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9681109-Systems-and-methods-for-configurable-demodulation.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-68&#8243; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: June 13, 2017<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; 
animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>EP3248374B1 &#8211; Method and apparatus for multiple technology depth map acquisition and fusion<\/h3>\n<h5>Inventor: Albrecht Johannes LINDNER, Kalin Mitkov ATANASSOV, Sergiu Radu GOMA<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">BACKGROUND<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">The present application relates generally to a generation of a depth map of a field of view (FOV) and, more specifically, to systems, methods, and devices for automatically generating a fused or aggregate depth map of the FOV that is configured to compensate for weaknesses that may be introduced by individually generated depth maps.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/EP3248374B1-Method-and-apparatus-for-multiple-technology-depth-map-acquisition-and-fusion.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-6e&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Feb. 
27, 2019<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; 
animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9698191 &#8211; System and method to extend near infrared spectral response for imaging systems<\/h3>\n<h5>Inventor: Biay-Cheng Hseih, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">One innovation includes an IR sensor having an array of sensor pixels to convert light into current, each sensor pixel of the array including a photodetector region, a lens configured to focus light into the photodetector region, the lens adjacent to the photodetector region so light propagates through the lens and into the photodetector region, and a substrate disposed with photodetector region between the substrate and the lens, the substrate having one or more transistors formed therein. The sensor also includes reflective structures positioned between at least a portion of the substrate and at least a portion of the photodetector region and such that at least a portion of the photodetector region is between the one or more reflective structures and the lens, the one or more reflective structures configured to reflect the light that has passed through at least a portion of the photodetector region into the photodetector region.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9698191-System-and-method-to-extend-near-infrared-spectral-response-for-imaging-systems.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-66&#8243; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; 
_builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: July 4, 2017<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9948920 &#8211; Systems and methods for error correction in structured light<\/h3>\n<h5>Inventor: James Wilson Nash, Kalin Mitkov Atanassov, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Systems and methods for error correction in structured light are disclosed. In one aspect, a method includes receiving, via a receiver sensor, a structured light image of at least a portion of a composite code mask encoding a plurality of codewords, the image including an invalid codeword. The method further includes detecting the invalid codeword. 
The method further includes generating a plurality of candidate codewords based on the invalid codeword. The method further includes selecting one of the plurality of candidate codewords to replace the invalid codeword. The method further includes generating a depth map for an image of the scene based on the selected candidate codeword. The method further includes generating a digital representation of a scene based on the depth map. The method further includes outputting the digital representation of the scene to an output device.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9948920-Systems-and-methods-for-error-correction-in-structured-light.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-62&#8243; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Apr. 
17, 2018<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9819863 &#8211; Wide field of view array camera for hemispheric and spherical imaging<\/h3>\n<h5>Inventor: Thomas Wesley Osborne, Todor Georgiev Georgiev, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Aspects relate to methods and systems for producing ultra-wide field of view images. In some embodiments, an image capture system for capturing wide field-of-view images comprises an aperture, a central camera positioned to receive light through the aperture, the center camera having an optical axis, a plurality of periphery cameras disposed beside the central camera and pointed towards a portion of the optical axis of the center camera, the plurality of cameras arranged around the center camera, and a plurality of extendible reflectors. 
The reflectors are configured to move from a first position to a second position and have a mirrored first surface that faces away from the optical axis of the center camera and a second black surface that faces towards the optical axis of the center camera, the plurality of periphery cameras arranged around the center camera.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9819863-Wide-field-of-view-array-camera-for-hemispheric-and-spherical-imaging.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5Y&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Nov. 
14, 2017<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>ES2690270T3 &#8211; Automatic synchronization of multiple depth cameras by time sharing<\/h3>\n<h5>Inventor: Sergiu Radu GOMA, Kalin Mitkov ATANASSOV<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">A depth sensing device (100a-d) for capturing an image containing depth information of a scene, comprising: a transmitter (200) capable of projecting light onto a scene, the transmitter comprising a laser (210) capable of producing a light beam (230) that includes a series of laser pulses, each pulse having a pulse length and the pulses being produced at a pulse frequency; a receiver (300) coupled to the transmitter (200) in a known relative orientation, the receiver comprising a shutter and a sensor array (315) capable of producing an image based on detecting the light projected by the transmitter (200) and reflected from the 
scene; and a controller comprising a processor (350), the controller coupled to the transmitter (200) and the receiver (300) and configured to: determine, based on a first image from the receiver (300), whether light indicative of at least one other depth sensing device (100a-d) is present in the scene; in response to determining, based on the first image, that such light is present in the scene: i) control the pulse length of the series of laser pulses to avoid interfering with the light determined to be present in the scene, ii) adjust an exposure period of the shutter to synchronize with the controlled pulse length, iii) determine, based on a next image from the receiver (300), whether the light is still present in the scene, and iv) in response to determining that the light is still present in the scene, delay the start of the exposure window and continue to determine, based on a next image of the plurality of images from the receiver, whether the light is still present in the scene; and in response to determining that the light is not present in the scene, start the exposure window, the exposure window comprising producing a laser pulse of the series of laser pulses at the controlled pulse length and activating the shutter for the exposure period so that the receiver (300) detects the laser pulse reflected from the scene.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/ES2690270T3-Automatic-synchronization-of-multiple-cameras-depth-by-sharing-time.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5W&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; 
custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Nov. 20, 2018<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US10068338 &#8211; Active sensing spatial resolution improvement through multiple receivers and code reuse<\/h3>\n<h5>Inventor: Kalin Mitkov Atanassov, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Structured light active sensing systems transmit and receive spatial codes to generate depth maps. Spatial codes can&#8217;t be repeated within a disparity range if they are to be uniquely identified. This results in large numbers of codes for single transmitter\/single receiver systems, because reflected ray traces from two object locations may be focused onto the same location of the receiver sensor, making it impossible to determine which object location reflected the code. 
However, the original code location may be uniquely identified because ray traces from the two object locations that focus onto the same location of the first receiver sensor may focus onto different locations on the second receiver sensor. Described herein are active sensing systems and methods that use two receivers to uniquely identify original code positions and allow for greater code reuse.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US10068338-Active-sensing-spatial-resolution-improvement-through-multiple-receivers-and-code-reuse.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5U&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Sep. 
4, 2018<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9386222 &#8211; Multi-camera system using folded optics free from parallax artifacts<\/h3>\n<h5>Inventor: Todor Georgiev Georgiev, Thomas Wesley Osborne, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Aspects relate to an array camera exhibiting little or no parallax artifacts in captured images. For example, the planes of the central mirror surfaces of the array camera can be located at a midpoint along, and orthogonally to, a line between the corresponding camera location and the virtual camera location. Accordingly, the cones of all of the cameras in the array appear as if coming from the virtual camera location after folding by the mirrors. 
Each sensor in the array \u201csees\u201d a portion of the image scene using a corresponding facet of the central mirror prism, and accordingly each individual sensor\/mirror pair represents only a sub-aperture of the total array camera. The complete array camera has a synthetic aperture generated based on the sum of all individual aperture rays.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9386222-Multi-camera-system-using-folded-optics-free-from-parallax-artifacts.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5S&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: July 5, 2016<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; 
animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9294672 &#8211; Multi-camera system using folded optics free from parallax and tilt artifacts<\/h3>\n<h5>Inventor: Todor Georgiev Georgiev, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Aspects relate to an array camera exhibiting little or no parallax artifacts in captured images. For example, the planes of the central mirror prism of the array camera can intersect at an apex defining the vertical axis of symmetry of the system. The apex can serve as a point of intersection for the optical axes of the sensors in the array. Each sensor in the array \u201csees\u201d a portion of the image scene using a corresponding facet of the central mirror prism, and accordingly each individual sensor\/mirror pair represents only a sub-aperture of the total array camera. The complete array camera has a synthetic aperture generated based on the sum of all individual aperture rays.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9294672-Multi-camera-system-using-folded-optics-free-from-parallax-and-tilt-artifacts.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5O&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Mar. 
22, 2016<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9774801 &#8211; Solid state image sensor with enhanced charge capacity and dynamic range<\/h3>\n<h5>Inventor: Biay-Cheng Hseih, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Certain aspects relate to imaging systems and methods for manufacturing imaging systems and image sensors. The imaging system includes a pixel array including a plurality of pixels, the pixels configured to generate a charge when exposed to light and disposed on a first layer. The imaging system further includes a plurality of pixel circuits for reading light integrated in the pixels coupled thereto, each of the plurality of pixel circuits comprising one or more transistors shared between a subset of the plurality of the pixels, the one or more transistors disposed on a second layer different than the first layer. 
The imaging system further includes a plurality of floating diffusion nodes configured to couple each of the plurality of pixels to the plurality of pixel circuits.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9774801-Solid-state-image-sensor-with-enhanced-charge-capacity-and-dynamic-range.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5K&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Sep. 26, 2017<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9560296 &#8211; Pixel readout architecture for full well capacity 
extension<\/h3>\n<h5>Inventor: Biay-Cheng Hseih, Jiafu Luo, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">Abstract<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Certain aspects relate to systems and techniques for full well capacity extension. For example, a storage capacitor included in the pixel readout architecture can enable multiple charge dumps from a pixel in the analog domain, extending the full well capacity of the pixel. Further, multiple reads can be integrated in the digital domain using a memory, for example DRAM, in communication with the pixel readout architecture. This also can effectively multiply a small pixel&#8217;s full well capacity. In some examples, multiple reads in the digital domain can be used to reduce, eliminate, or compensate for kTC noise in the pixel readout architecture.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9560296-Pixel-readout-architecture-for-full-well-capacity-extension.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5G&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Jan. 
31, 2017<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>EP2920546B1 &#8211; Structured light active depth sensing systems combining multiple images to compensate for differences in reflectivity and\/or absorption<\/h3>\n<h5>Inventor: Judit Martinez Bauza, Kalin Mitkov ATANASSOV, Sergiu Radu GOMA<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\">BACKGROUND\u00a0FIELD<\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Various features pertain to active depth sensing and more specifically to techniques to compensate for different reflectivity\/absorption coefficients of objects in a scene when performing active depth sensing using structured light.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image 
src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/EP2920546B1-Structured-light-active-depth-sensing-systems-combining-multiple-images-to-compensate-for-differences-in-reflectivity-and_or-absorption-1.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5A&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Nov. 30, 2016<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9398264 &#8211; Multi-camera system using folded optics<\/h3>\n<h5>Inventor:\u00a0Todor G. 
Georgiev, Thomas Wesley Osborne, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Described herein are methods and devices that employ a plurality of image sensors to capture a target image of a scene. As described, positioning at least one reflective or refractive surface near the plurality of image sensors enables the sensors to capture together an image of wider field of view and longer focal length than any sensor could capture individually by using the reflective or refractive surface to guide a portion of the image scene to each sensor. The different portions of the scene captured by the sensors may overlap, and may be aligned and cropped to generate the target image.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9398264-Multi-camera-system-using-folded-optics.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5n&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: July 19, 2016<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column 
type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9207070 &#8211; Transmission of affine-invariant spatial mask for active depth sensing<\/h3>\n<h5>Inventor: Kalin Mitkov Atanassov, James Wilson Nash, Vikas Ramachandra, Sergiu Radu Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">A method operational on a transmitter device is provided for projecting a composite code mask. A composite code mask on a tangible medium is obtained, where the composite code mask includes a code layer combined with a carrier layer. The code layer may include uniquely identifiable spatially-coded codewords defined by a plurality of symbols. The carrier layer may be independently ascertainable and distinct from the code layer and includes a plurality of reference objects that are robust to distortion upon projection. At least one of the code layer and carrier layer may be pre-shaped by a synthetic point spread function prior to projection. 
At least a portion of the composite code mask is projected, by the transmitter device, onto a target object to help a receiver ascertain depth information for the target object with a single projection of the composite code mask.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9207070-Transmission-of-affine-invariant-spatial-mask-for-active-depth-sensing.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5h&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Nov. 
28, 2013<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9075473 &#8211; Interactive display with removable front panel<\/h3>\n<h5>Inventor: Daniel Moses, Robert Mitchell Kleiman, Sergiu Radu Goma, Milivoje Aleksic, Sergio Lopez<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Described are methods and apparatus for adjusting images of a stereoscopic image pair. The methods and apparatus may capture a first and second image with first and second imaging sensors. The two imaging sensors have intrinsic and extrinsic parameters. A normalized focal distance of a reference imaging sensor may also be determined based on intrinsic and extrinsic parameters. A calibration matrix is then adjusted based on the normalized focal distance. 
The calibration matrix may be applied to an image captured by an image sensor.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9075473-Interactive-display-with-removable-front-panel.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5d&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Jul. 7, 2015<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9275459 &#8211; Method and apparatus for calibrating an imaging device<\/h3>\n<h5>Inventor: Sergiu R Goma, Kalin Mitkov Atanassov, Vikas Ramachandra<\/h5>\n<h5>Current Assignee: Qualcomm 
Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Described are methods and apparatus for adjusting images of a stereoscopic image pair. The methods and apparatus may capture a first and second image with first and second imaging sensors. The two imaging sensors have intrinsic and extrinsic parameters. A normalized focal distance of a reference imaging sensor may also be determined based on intrinsic and extrinsic parameters. A calibration matrix is then adjusted based on the normalized focal distance. The calibration matrix may be applied to an image captured by an image sensor.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9560334-Methods-and-apparatus-for-improved-cropping-of-a-stereoscopic-image-pair-1.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-59&#8243; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Mar. 
1, 2016<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9438889 &#8211; System and method for improving methods of manufacturing stereoscopic image sensors<\/h3>\n<h5>Inventor: Kalin Mitkov Atanassov, Sergiu R Goma, Vikas Ramachandra, Milivoje Aleksic<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Described herein are methods, systems and apparatus to improve imaging sensor production yields. In one method, a stereoscopic image sensor pair is provided from a manufacturing line. One or more images of a correction pattern are captured by the image sensor pair. Correction angles of the sensor pair are determined based on the images of the correction pattern. The correction angles of the sensor pair are represented graphically in a three dimensional space. 
Analysis of the graphical representation of the correction angles through statistical processing results in a set of production correction parameters that may be input into a manufacturing line to improve sensor pair yields.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9438889-System-and-method-for-improving-methods-of-manufacturing-stereoscopic-image-sensors.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-57&#8243; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Sep. 
6, 2016<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9560334 &#8211; Methods and apparatus for improved cropping of a stereoscopic image pair<\/h3>\n<h5>Inventor: Vikas Ramachandra, Kalin Mitkov Atanassov, Sergiu R. Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Described herein are methods and apparatus to adjust the convergence point of a stereoscopic image pair captured by an imaging device. In one method, a first image and a second image of a stereoscopic image pair are provided, and then shifting or cropping of the first image is performed to align the first and second image. This shifting or cropping is performed while preserving the second image. 
The method then determines a target horizontal image disparity based on a desired convergence point of the stereoscopic image pair; when the target horizontal disparity is greater than a predetermined maximum, the cropping of an outside dimension of the first image is limited, in some implementations to the predetermined maximum.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9560334-Methods-and-apparatus-for-improved-cropping-of-a-stereoscopic-image-pair.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-53&#8243; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Jan. 
31, 2017<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9485495 &#8211; Autofocus for stereo images<\/h3>\n<h5>Inventor: Kalin M. Atanassov; Sergiu R. Goma; Vikas Ramachandra<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Present embodiments contemplate systems, apparatus, and methods to determine an appropriate focal depth for a sensor based upon a pair of stereoscopic images. Particularly, certain of the embodiments contemplate determining keypoints for each image, identifying correlations between the keypoints, and deriving object distances from the correlations. 
These distances may then be used to select a proper focal depth for one or more sensors.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9485495-Autofocus-for-stereo-images.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4Z&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Nov. 1, 2016<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US9001227 &#8211; Combining data from multiple image sensors<\/h3>\n<h5>Inventor: Sergiu R. 
Goma; Milivoje Aleksic; Hau Hwang; Joseph Cheung<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">A method of combining data from multiple sensors is disclosed. The method includes providing a common control signal to multiple image sensors. Each of the multiple image sensors is responsive to the common control signal to generate image data. The method also includes receiving synchronized data output from each of the multiple image sensors.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US9001227-Combining-data-from-multiple-image-sensors.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-5r&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Apr. 
7, 2015<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US8896668 &#8211; Combining data from multiple image sensors<\/h3>\n<h5>Inventor: Sergiu R. Goma; Milivoje Aleksic; Hau Hwang; Joseph Cheung<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\"><span>A method of combining data from multiple sensors is disclosed. The method includes receiving lines of image data at an image processor having an input for a single camera. Each line of the image data includes first line data from a first image captured by a first camera and second line data from a second image captured by a second camera. The method also includes generating an output frame having a first section corresponding to line data of the first image and having a second section corresponding to line data of the second image. 
The first section and the second section are configured to be used to generate a three-dimensional (3D) image format or a 3D video format.<\/span><\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US8896668-Combining-data-from-multiple-image-sensors.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4V&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Nov. 25, 2014<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US8970672 &#8211; Three-dimensional image processing<\/h3>\n<h5>Inventor: Sergiu R. Goma; Kalin M. 
Atanassov; Milivoje Aleksic<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Systems and methods of 3D image processing are disclosed. In a particular embodiment, a three-dimensional (3D) media player is configured to receive input data including at least a first image corresponding to a scene and a second image corresponding to the scene and to provide output data to a 3D display device. The 3D media player is responsive to user input including at least one of a zoom command and a pan command. The 3D media player includes a convergence control module configured to determine a convergence point of a 3D rendering of the scene responsive to the user input.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US8970672-Three-dimensional-image-processing.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4S&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Mar. 
3, 2015<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US8736695 &#8211; Parallel image processing using multiple processors<\/h3>\n<h5>Inventor: Hau Hwang; Joseph Cheung; Sergiu R. Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">An electronic device for parallel image processing using multiple processors is disclosed. The electronic device includes multiple image sensors for providing image data. The electronic device also includes multiple processors for processing segmented image data to produce processed segmented image data. Each processor is dedicated to one of the image sensors. A multiple processor interface is also included. 
The multiple processor interface maps the image data to the processors, segments the image data to produce the segmented image data and synchronizes the segmented image data to processor clock rates.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US8736695-Parallel-image-processing-using-multiple-processors.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4M&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: May 27, 2014<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US8761531 &#8211; Image data compression involving sub-sampling of luma 
and chroma values<\/h3>\n<h5>Inventor: Sergiu R. Goma<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">Chroma values in image data may be sub-sampled, thereby obtaining sub-sampled chroma values. The sub-sampled chroma values may be compressed, thereby obtaining compressed, sub-sampled chroma values. Luma values in the image data may be sub-sampled, thereby obtaining sub-sampled luma values. Edge information for the luma values that are discarded as part of the luma sub-sampling operation may be determined.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US20110007979-Image-data-compression-involving-sub-sampling-of-luma-and-chroma-values.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4K&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Jun. 
24, 2014<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US8223796 &#8211; Graphics multi-media IC and method of its operation<\/h3>\n<h5>Inventor: Fariborz Pourbigharaz; Sergiu Goma; Milivoje Aleksic; Andrzej Mamona<\/h5>\n<h5>Current Assignee: ATI Technologies ULC<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">A graphics multi-media integrated circuit (GMIC) is connected to a host processor over two serial links: a half duplex bi-directional serial link which accords to a protocol defined for a display serial interface, and a uni-directional serial link which accords to a compatible protocol defined for a camera serial interface. The GMIC receives packets according to the protocol from the host over the half duplex bi-directional serial link and processes these packets. The GMIC sends packets according to the protocol to the host over the uni-directional serial link. 
A packet from the host can request a processing operation by the GMIC or can initiate a memory operation at the memory of the GMIC. The GMIC can also send packets to the host to initiate a memory operation at the memory of the host. The GMIC may be connected to a display over a bi-directional serial link according to the display serial interface protocol and to a camera over a uni-directional serial link and a bi-directional control link according to the camera serial interface so that the host controls the display and camera indirectly through the GMIC.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US20090315899-Graphics-multi-media-ic-and-method-of-its-operation.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4I&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Jul. 
17, 2012<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US7571271 &#8211; Lane merging<\/h3>\n<h5>Inventor: Sergiu Goma; Fariborz Pourbigharaz; Milivoje Aleksic<\/h5>\n<h5>Current Assignee: ATI Technologies ULC<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">A buffer is associated with each of a plurality of data lanes of a multi-lane serial data bus. Data words are timed through the buffers of active ones of the data lanes. Words timed through buffers of active data lanes are merged onto a parallel bus such that data words from each of the active data lanes are merged onto the parallel bus in a pre-defined repeating sequence of data lanes. 
This approach allows other, non-active, data lanes to remain in a power conservation state.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US7571271-Lane-merging.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4n&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Aug. 4, 2009<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US8063957 &#8211; Method and apparatus for processing bad pixels<\/h3>\n<h5>Inventor: Sergiu Goma, Milivoje Aleksic<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row 
ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">A technique for processing at least one bad pixel occurring in an image sensing system is provided. Dynamic bad pixel detection is performed on a plurality of streaming pixels taken from at least one controlled image, and value and coordinate information for each bad pixel is subsequently stored as stored bad pixel information. Thereafter, static bad pixel correction may be performed based on the stored bad pixel information. The stored bad pixel information may be verified based on histogram analysis performed on the plurality of streaming pixels. The technique for processing bad pixels in accordance with the present invention may be embodied in suitable circuitry or, more broadly, within devices incorporating image sensing systems.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US8063957-Method-and-apparatus-for-processing-bad-pixels.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4B&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Nov. 
22, 2011<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US7876970 &#8211; Method and apparatus for white balancing digital images<\/h3>\n<h5>Inventor: Sergiu Goma, Milivoje Aleksic<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">At least one illuminant white point estimate is determined in a color space having radially defined saturation based on a reference image. A chromatic adaptation correction vector (CACV) is determined based on the at least one illuminant white point estimate. 
Corrected pixels are obtained by applying the CACV (preferably in a cone response color space using a correction matrix based on the CACV) to uncorrected image pixels corresponding to a target image, which may comprise the reference image or another image.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US7876970-Method-and-apparatus-for-white-balancing-digital-images.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4p&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Jan. 
25, 2011<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3>US7599569 &#8211; Method and apparatus for bilateral high pass filter<\/h3>\n<h5>Inventor: Maxim Smirnov, Milivoje Aleksic, Sergiu Goma<\/h5>\n<h5>Current Assignee: ATI Technologies ULC<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\"><span>A target pixel and surrounding pixels corresponding to the target pixel are obtained from a digitally represented image. A high pass photometric filtering kernel is determined based at least in part upon the target pixel and the surrounding pixels. A high pass spatial filtering kernel is provided and multiplied with the high pass photometric filtering kernel to provide a bilateral high pass filtering kernel. The resulting bilateral high pass filtering kernel is thereafter applied to the target pixel and the surrounding pixels to provide a filtered pixel. 
When it is desirable to combine noise filtering capabilities with sharpening capabilities, the bilateral high pass filter of the present invention may be combined with a bilateral low pass filtering kernel to provide a combined noise reduction and edge sharpening filter. The present invention may be advantageously applied to a variety of devices, including cellular telephones that employ image sensing technology.<\/span><\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US7599569.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4o&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Oct. 
19, 2010<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3><span>US7596743 &#8211; Method and apparatus for error management<\/span><\/h3>\n<h5>Inventor: Sergiu Goma, Milivoje Aleksic<\/h5>\n<h5>Current Assignee: ATI Technologies ULC<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">To derive a Hamming code to manage data errors, a set of at least four parity bit positions is selected for parity bits which will protect a set of data bits (where each data bit has a data bit position in the data bit set). A syndrome is determined for each data bit position. This involves selecting a unique sub-set of at least three parity bit positions. 
The unique sub-set shares at least one parity bit position with at least one other unique sub-set of at least three parity bit positions. A parity bit value may then be calculated for each parity bit position based on the determined syndromes. The header of a packet may be provided with a word which defines the length of the packet and an error management code generated utilizing this word so that errors in the word may be detected and, possibly, corrected.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US7596743.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-45&#8243; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Apr. 
19, 2007<\/p>\n<p>This is the ECC used in the MIPI-CSI and MIPI-DSI serial camera and display interfaces.<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_4,1_4&#8243; custom_padding=&#8221;90px||90px|&#8221; _builder_version=&#8221;3.26.3&#8243; border_width_bottom=&#8221;1px&#8221; border_color_bottom=&#8221;rgba(0,0,0,0.12)&#8221;][et_pb_column type=&#8221;3_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; text_font_size=&#8221;16px&#8221; text_line_height=&#8221;1.8em&#8221; header_font=&#8221;||||||||&#8221; header_4_font=&#8221;Nunito|700||on|||||&#8221; header_4_text_color=&#8221;#94a6bf&#8221; header_4_letter_spacing=&#8221;2px&#8221; header_4_line_height=&#8221;1.4em&#8221; animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;5%&#8221; animation_starting_opacity=&#8221;100%&#8221;]<\/p>\n<h3><span>US7818593 &#8211;\u00a0<\/span>Power conservation for devices on a shared bus using bus busy and free signals<\/h3>\n<h5>Inventor: Sergiu Goma, Milivoje Aleksic<\/h5>\n<h5>Current Assignee: Qualcomm Inc<\/h5>\n<p><span><\/span><\/p>\n<h3 class=\"row ArticleContentHeadRow\"><text class=\"ArticleContentBoldText\">Abstract<\/text><\/h3>\n<div class=\"row ArticleContentRow\"><\/div>\n<div class=\"row ArticleContentRow\" style=\"text-align: justify;\">A method of operating a shared bus comprises sending a wake-up signal on the shared bus. The wake-up signal comprises a sequence of signals, each signal of the sequence being one of a signal indicating the bus is free and a signal indicating the bus is busy. A microcontroller to effect this method is also contemplated. 
A wake-up device for a shared bus has a first latch to recognize a signal indicating one of said shared bus being free and said shared bus being busy and selectively output a recognition signal and a second latch to, after receipt of the recognition signal, recognize a signal indicating another of the shared bus being free and the shared bus being busy and selectively output a power-on signal. The latches may be D-type flip flops.<\/div>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;1_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;http:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-content\/uploads\/2019\/07\/US7818593-1.png&#8221; _builder_version=&#8221;3.26.3&#8243;][\/et_pb_image][et_pb_cta button_url=&#8221;https:\/\/wp.me\/ab8Jzu-4e&#8221; url_new_window=&#8221;on&#8221; button_text=&#8221;PDF&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;#94a6bf&#8221; custom_padding=&#8221;0px|0px|15px|0px||&#8221;][\/et_pb_cta][et_pb_text _builder_version=&#8221;3.26.3&#8243;]<\/p>\n<p>Date of Patent: Oct. 
19, 2010<\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=&#8221;1&#8243; custom_padding_last_edited=&#8221;on|desktop&#8221; admin_label=&#8221;Upcoming Book Section&#8221; _builder_version=&#8221;3.22&#8243; use_background_color_gradient=&#8221;on&#8221; background_color_gradient_start=&#8221;rgba(56,62,89,0.93)&#8221; background_color_gradient_end=&#8221;#383e59&#8243; background_color_gradient_direction=&#8221;90deg&#8221; background_color_gradient_start_position=&#8221;60%&#8221; background_color_gradient_end_position=&#8221;0%&#8221; background_color_gradient_overlays_image=&#8221;on&#8221; background_blend=&#8221;multiply&#8221; custom_margin=&#8221;|||&#8221; custom_margin_tablet=&#8221;0px|||&#8221; custom_margin_last_edited=&#8221;off|desktop&#8221; custom_padding=&#8221;100px|0|180px|&#8221;][\/et_pb_section][et_pb_section fb_built=&#8221;1&#8243; admin_label=&#8221;Footer Section&#8221; _builder_version=&#8221;3.22&#8243; background_color=&#8221;#22262d&#8221; custom_padding=&#8221;100px||100px|&#8221;][et_pb_row column_structure=&#8221;1_3,1_3,1_3&#8243; custom_padding=&#8221;70px|3%|70px|3%&#8221; custom_margin=&#8221;-10%|||&#8221; custom_padding_last_edited=&#8221;on|phone&#8221; _builder_version=&#8221;3.25&#8243; background_color=&#8221;#bcaca0&#8243; border_radii=&#8221;on|10px|10px|10px|10px&#8221; box_shadow_style=&#8221;preset2&#8243; box_shadow_horizontal=&#8221;0px&#8221; box_shadow_vertical=&#8221;35px&#8221; box_shadow_blur=&#8221;65px&#8221; box_shadow_color=&#8221;rgba(188,163,146,0.1)&#8221;][et_pb_column type=&#8221;1_3&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][\/et_pb_column][et_pb_column type=&#8221;1_3&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][\/et_pb_column][et_pb_column type=&#8221;1_3&#8243; _builder_version=&#8221;3.25&#8243; 
custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_blurb title=&#8221;Contact Author&#8221; use_icon=&#8221;on&#8221; font_icon=&#8221;%%131%%&#8221; icon_color=&#8221;#ffffff&#8221; use_circle=&#8221;on&#8221; circle_color=&#8221;rgba(189,173,160,0)&#8221; use_circle_border=&#8221;on&#8221; circle_border_color=&#8221;#ffffff&#8221; use_icon_font_size=&#8221;on&#8221; icon_font_size=&#8221;32px&#8221; admin_label=&#8221;Contact Author&#8221; _builder_version=&#8221;3.0.106&#8243; header_font=&#8221;Abhaya Libre|700|||||||&#8221; header_text_color=&#8221;#ffffff&#8221; header_font_size=&#8221;24px&#8221; header_line_height=&#8221;1.2em&#8221; body_font=&#8221;Nunito||||||||&#8221; body_text_color=&#8221;#ffffff&#8221; body_font_size=&#8221;18px&#8221; text_orientation=&#8221;center&#8221; locked=&#8221;off&#8221;]<\/p>\n<p>info@blueflagiris.com<\/p>\n<p>[\/et_pb_blurb][\/et_pb_column][\/et_pb_row][\/et_pb_section]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>[et_pb_section fb_built=&#8221;1&#8243; admin_label=&#8221;Page Title Section&#8221; _builder_version=&#8221;3.26.3&#8243; background_color_gradient_direction=&#8221;115deg&#8221; background_image=&#8221;https:\/\/otm.uic.edu\/wp-content\/uploads\/sites\/66\/2018\/01\/patent-seal.jpg&#8221; custom_padding=&#8221;150px||0|&#8221;][et_pb_row custom_padding=&#8221;27px|40px||10%&#8221; custom_margin=&#8221;|||0px&#8221; _builder_version=&#8221;3.26.3&#8243; background_color=&#8221;rgba(99,132,108,0.81)&#8221; background_size=&#8221;initial&#8221; background_position=&#8221;top_left&#8221; background_repeat=&#8221;repeat&#8221; border_radii=&#8221;||6px|6px|&#8221; border_width_right=&#8221;30px&#8221; border_color_right=&#8221;#bdada0&#8243; box_shadow_style=&#8221;preset3&#8243; box_shadow_vertical=&#8221;35px&#8221; box_shadow_blur=&#8221;70px&#8221; box_shadow_spread=&#8221;-35px&#8221; box_shadow_color=&#8221;rgba(0,0,0,0.6)&#8221; max_width=&#8221;960px&#8221; 
animation_style=&#8221;slide&#8221; animation_direction=&#8221;left&#8221; animation_intensity_slide=&#8221;2%&#8221; use_custom_width=&#8221;on&#8221; custom_width_px=&#8221;960px&#8221;][et_pb_column type=&#8221;4_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding_tablet=&#8221;|||10%&#8221; custom_padding_last_edited=&#8221;off|desktop&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_text admin_label=&#8221;Title&#8221; _builder_version=&#8221;3.26.3&#8243; text_font=&#8221;||||||||&#8221; header_font=&#8221;Abhaya Libre|700|||||||&#8221; header_font_size=&#8221;70px&#8221; header_line_height=&#8221;1.2em&#8221; header_4_font=&#8221;||||||||&#8221; background_layout=&#8221;dark&#8221; custom_margin=&#8221;|||&#8221; custom_padding=&#8221;|||&#8221; header_font_size_tablet=&#8221;40px&#8221; header_font_size_last_edited=&#8221;on|tablet&#8221;] Granted Patents [\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=&#8221;1&#8243; admin_label=&#8221;Events Section&#8221; 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_et_pb_use_builder":"on","_et_pb_old_content":"","om_disable_all_campaigns":false,"jetpack_post_was_ever_published":false,"footnotes":""},"class_list":["post-243","page","type-page","status-publish","hentry"],"jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/Pb8Jzu-3V","_links":{"self":[{"href":"https:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-json\/wp\/v2\/pages\/243","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-json\/wp\/v2\/comments?post=243"}],"version-history":[{"count":38,"href":"https:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-json\/wp\/v2\/pages\/243\/revisions"}],"predecessor-version":[{"id":417,"href":"https:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-json\/wp\/v2\/pages\/243\/revisions\/417"}],"wp:attachment":[{"href":"https:\/\/www.blueflagiris.com\/blueflag_wordpress\/wp-json\/wp\/v2\/media?parent=243"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}