Google has been working on using radar technology in their smart devices, and it started to do so with their Pixel 4 smartphones. While we first covered their technology in patent form back in 2016, Google actually filed for it back in 2014. We posted another report on their radar technology in 2019 and again in March 2020 in a report titled “Google Patent reveals plans to use their Radar technology and next-gen Soli Chip to work with Retail AR Applications and more.”
Today, the US Patent & Trademark Office published a patent application from Apple that relates to the use of radar technology that could replace touch displays in future iPhones and beyond.
Apple’s invention relates to a device, an apparatus, a method and a computer program for detecting a touch input to a surface, to a touch screen module, a touch screen apparatus, mobile terminals and a touch screen computer.
In at least some examples, the detection of the touch input may be based on using radar, e.g. by transmitting electromagnetic radiation and receiving a portion of the electromagnetic radiation reflected by nearby objects, e.g. by a finger performing the touch input.
Using radar to detect the touch input to the surface may allow the construction of thinner touch screens at a cost that may be lower than a cost of capacitive touch screens, resistive touch screens, or other touch screens.
According to Apple, the resulting display would be thinner than those using on-cell or in-cell technology. It’s believed that the higher-end iPhone 12 models will use on-cell displays from Samsung.
Furthermore, by adjusting the region in which touch may be detected, along with the temporal and/or spatial resolution, a radar-based touch screen may consume less energy than a capacitive touch screen. Additionally, larger touch screens may be constructed using radar technology with little or no loss in the precision of touch detection.
The control module 16 may be configured to provide information related to the position of the object via an interface. The information related to the position of the object may comprise two-dimensional coordinates of the object or three-dimensional coordinates of the object, for example.
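To make the interface concrete, here is a minimal sketch of what such a position report might look like. The class and field names are our own illustrative assumptions; the patent only specifies that the information may comprise two- or three-dimensional coordinates.

```python
from dataclasses import dataclass
from typing import Optional, Tuple, Union

@dataclass
class TouchPosition:
    """Hypothetical position report a control module might expose."""
    x: float
    y: float
    z: Optional[float] = None  # hover height above the surface; None for 2D-only

def report(pos: TouchPosition) -> Union[Tuple[float, float], Tuple[float, float, float]]:
    """Return 2D or 3D coordinates, depending on what was detected."""
    if pos.z is None:
        return (pos.x, pos.y)
    return (pos.x, pos.y, pos.z)
```

A consumer of the interface would then receive `(x, y)` for a contact touch and `(x, y, z)` when hover information is available.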
To conserve energy, the detection of the touch input may be performed in two time intervals: in a first interval, the surface may be coarsely and sparsely scanned for objects approaching it, and in a second interval, a more precise detection (e.g. with a higher temporal resolution) may be performed to determine the position of the touch input.
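The two-phase scheme can be sketched as a simple rate selector. The specific rates and the hand-off threshold below are assumed values for illustration only; the patent does not disclose concrete numbers.

```python
# Assumed example values; the patent specifies only the two-phase principle.
COARSE_HZ = 10    # sparse, low-power scan rate while idle (phase 1)
FINE_HZ = 240     # high temporal resolution once an object approaches (phase 2)
NEAR_MM = 20.0    # hand-off threshold: object height above the surface

def scan_rate(object_height_mm):
    """Pick the radar scan rate for the next frame.

    Phase 1: no object near the surface -> coarse, sparse scanning.
    Phase 2: an object within NEAR_MM of the surface -> precise,
    high-rate scanning to pin down the touch position.
    """
    if object_height_mm is None or object_height_mm > NEAR_MM:
        return COARSE_HZ
    return FINE_HZ
```

In a real device the hand-off would likely also shrink the scanned region around the approaching object, which is the other energy lever the filing mentions.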
The basic principle of at least some examples may be to eliminate the need for a touch screen sensor that covers the device screen. Instead, very small touch-screen radar (TS-radar) detectors may be placed on the edges under the display screen (two to four detectors). These radar detectors may be capable of determining the position of an object (a human finger or any other object, such as a stylus) along the X, Y and Z axes with sub-millimeter precision.
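One plausible way a few edge-mounted range detectors could recover a touch position is classic multilateration: intersecting the range circles reported by each detector. This is our own illustrative sketch, not Apple's disclosed algorithm, and the detector positions are assumed values; for simplicity it solves the 2D (X-Y) case with three detectors.

```python
import math

# Assumed detector positions at the screen corners, in millimeters.
DETECTORS = [(0.0, 0.0), (80.0, 0.0), (0.0, 160.0)]

def locate(ranges):
    """Solve for the (x, y) touch point from three range readings.

    Each detector i constrains the point to a circle
    (x - xi)^2 + (y - yi)^2 = ri^2. Subtracting detector 0's
    equation from detectors 1 and 2 cancels the quadratic terms,
    leaving a 2x2 linear system solved here by Cramer's rule.
    """
    (x0, y0), (x1, y1), (x2, y2) = DETECTORS
    r0, r1, r2 = ranges
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

Extending the same idea to a fourth detector and Z-axis ranges would give the full three-dimensional (hover) position the filing describes.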
Apple’s patent FIG. 5 below shows a block diagram of a smartphone with four radar sensors; FIG. 6 shows a schematic diagram of a radar phased array antenna end-fire radiation pattern; and FIG. 7 shows a schematic diagram of touch screen scanning and detection of a human finger.
If Google’s patent filings are any guide, radar could also be used to introduce Air-Gesturing to control basic iPhone features.
Apple’s patent application that was published today by the U.S. Patent Office was filed back in Q1 2020. Considering that this is a patent application, the timing of such a product to market is unknown at this time.
Both of the Apple engineers listed as the inventors, Mr. Igal and Mr. Ofir, are from Israel.