US20180088686A1 - Domed orientationless input assembly for controlling an electronic device - Google Patents
- Publication number: US20180088686A1 (application No. US 15/714,348)
- Authority: US (United States)
- Prior art keywords: user, coordinate system, input, input assembly, assembly
- Legal status: Granted (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G—PHYSICS; G06—COMPUTING, CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING; G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer, and output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements; G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer; G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form; G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, and accessories therefor
- G06F3/0346—Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0354—Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
- G06F3/03544—Mice or pucks having dual sensing arrangement, e.g. two balls or two coils used to track rotation of the pointing device
- G06F3/03547—Touch pads, in which fingers can move on a surface
- G06F3/038—Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
- G06F3/0383—Signal control means within the pointing device
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
Definitions
- This generally relates to an input assembly for controlling an electronic device and, more particularly, to a domed orientationless and/or ambidextrous input assembly for controlling an electronic device.
- Some systems may include an electronic device with a display assembly operative to present a graphical user interface, as well as an electronic input assembly that may be manipulated by a user to generate user control signals operative to adjust the graphical user interface.
- However, existing systems often limit the ways in which a user may interact with an input assembly to generate particular user control signals.
- Domed input assemblies for controlling an electronic device and methods for using domed input assemblies for controlling an electronic device are provided.
- an input assembly for controlling an electronic device, where the input assembly may include a housing structure providing an orientationless surface with respect to at least one axis of an input coordinate system of the input assembly, a sensor subassembly at least partially protected by the housing structure, and a processor operative to detect, with the sensor subassembly, a user coordinate system of a user with respect to the orientationless surface, detect, with the sensor subassembly, a physical use of the housing structure, and determine a control action for the electronic device based on the user coordinate system and the physical use.
- a method for controlling an electronic device using an input assembly of a user may include detecting a user interface event that includes the user interfacing with the input assembly, after the detecting the user interface event, defining a user coordinate system for the detected user interface event from a plurality of possible user coordinate systems, after the defining the user coordinate system, detecting a physical use of the input assembly, and, based on both the defined user coordinate system and the detected physical use, determining control data that is operative to control the electronic device.
- an input assembly for controlling an electronic device, where the input assembly may include a housing structure providing a surface within an input coordinate system of the input assembly, a sensor subassembly at least partially protected by the housing structure, and a processor operative to detect, with the sensor subassembly, a user coordinate system of a user with respect to the surface from at least three possible user coordinate systems, wherein each one of the at least three possible user coordinate systems has a different orientation with respect to the input coordinate system of the input assembly, detect, with the sensor subassembly, a physical use of the housing structure, and determine a control action for the electronic device based on the user coordinate system and the physical use.
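One way to picture the "define a user coordinate system" step in the summaries above is to estimate the user's orientation angle from the finger contact points detected by the sensor subassembly. The centroid heuristic and all names below are illustrative assumptions, not the disclosed implementation:

```python
import math

def define_user_coordinate_system(contacts):
    """Estimate the user's orientation, in degrees within the input
    assembly's coordinate system, from detected finger contact points.

    Assumption (not from the disclosure): each contact is an (x, y)
    point relative to the dome's center, and the user's direction is
    taken as the direction of the contact centroid.
    """
    cx = sum(x for x, _ in contacts) / len(contacts)
    cy = sum(y for _, y in contacts) / len(contacts)
    return math.degrees(math.atan2(cy, cx)) % 360.0
```

Under this sketch, fingers clustered on the dome's +x side yield a 0-degree user coordinate system, while the same grip applied from the +y side yields a 90-degree one, with no preferred "front" built into the assembly itself.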
- FIG. 1 is a schematic view of an illustrative user input system including an electronic device and an electronic input assembly;
- FIG. 1A is a left side view of an exemplary input assembly interacting in a first assembly orientation with an exemplary electronic device of the system of FIG. 1 ;
- FIG. 1B is a front side view of the system of FIGS. 1 and 1A with the input assembly interacting in the first assembly orientation with the electronic device;
- FIG. 1C is a top view of the system of FIGS. 1-1B with the input assembly interacting in the first assembly orientation with the electronic device;
- FIG. 1D is a top view of the system of FIGS. 1-1C with the input assembly interacting in the first assembly orientation with the electronic device and with a user interacting in a first user orientation with the input assembly;
- FIG. 1E is a top view of the system of FIGS. 1-1D with the input assembly interacting in the first assembly orientation with the electronic device and with the user interacting in a second user orientation with the input assembly;
- FIG. 1F is a top view of the system of FIGS. 1-1E with the input assembly interacting in the first assembly orientation with the electronic device and with the user interacting in a third user orientation with the input assembly;
- FIG. 1G is a front side view of the system of FIGS. 1-1F taken from line 1G-1G of FIG. 1D, but without the user shown;
- FIG. 1H is a front side view of the system of FIGS. 1-1G when the input assembly has been physically manipulated by the user;
- FIG. 2A is a left side view of the system of FIGS. 1-1H with the input assembly interacting in a second assembly orientation with the electronic device;
- FIG. 2B is a front side view of the system of FIGS. 1-2A with the input assembly interacting in the second assembly orientation with the electronic device;
- FIG. 2C is a top view of the system of FIGS. 1-2B with the input assembly interacting in the second assembly orientation with the electronic device;
- FIG. 2D is a top view of the system of FIGS. 1-2C with the input assembly interacting in the second assembly orientation with the electronic device and with the user interacting in a fourth user orientation with the input assembly;
- FIG. 2E is a top view of the system of FIGS. 1-2D with the input assembly interacting in the second assembly orientation with the electronic device and with the user interacting in a fifth user orientation with the input assembly;
- FIG. 2F is a top view of the system of FIGS. 1-2E with the input assembly interacting in the second assembly orientation with the electronic device and with the user interacting in a sixth user orientation with the input assembly;
- FIG. 3 is a flowchart of an illustrative process for using an input assembly to control an electronic device.
- the present disclosure relates to input assemblies (e.g., domed orientationless and/or ambidextrous input assemblies) for controlling an electronic device and methods for using input assemblies for controlling an electronic device.
- Determination of a current user orientation with respect to an input assembly (e.g., determination of a current orientation of a user coordinate system of a user with respect to an input coordinate system of an input assembly) may be used to map particular user physical manipulations of the input assembly to particular types of control data for controlling an electronic device (e.g., for controlling a cursor on a screen of the electronic device).
- This may enable consistent control data to be generated in response to a particular physical gesture imparted by a user on an input assembly, no matter the orientation of the user to the input assembly, the orientation of the input assembly to the electronic device, and/or the orientation of the user to the electronic device.
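The orientation-invariant mapping described above can be sketched as a plain 2D rotation: a raw motion vector sensed in the input assembly's coordinate system is rotated into the detected user coordinate system before being turned into control data. The angle convention (the user's orientation is the direction from the assembly's center toward the user) and the function name are hypothetical, not the patented implementation:

```python
import math

def to_control_vector(user_angle_deg, dx, dy):
    """Rotate a motion vector (dx, dy), sensed in the input assembly's
    coordinate system, by the negative of the user's orientation angle,
    expressing the motion in the user's coordinate system."""
    a = math.radians(-user_angle_deg)
    return (dx * math.cos(a) - dy * math.sin(a),
            dx * math.sin(a) + dy * math.cos(a))
```

In this sketch, a user seated at 0 degrees pushing a finger away from the body (assembly-frame motion (-1, 0)) and a user seated at 90 degrees doing the same (assembly-frame motion (0, -1)) both produce the same user-relative vector, so the same gesture yields the same control data.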
- Domed input assemblies for controlling an electronic device and methods for using domed input assemblies for controlling an electronic device are provided and described with reference to FIGS. 1-3 .
- FIG. 1 is a schematic view of an illustrative system 1 with an electronic device 100 and an electronic input assembly 200 .
- A system user (e.g., user U of FIGS. 1D-1F) may slide, rotate, squeeze, press, touch, or otherwise physically manipulate input assembly 200 (e.g., a mouse, a trackpad, a user-manipulated electronic input device, a hand-held input device, or the like) in any suitable manner relative to electronic device 100 (e.g., a tablet computer, laptop computer, desktop computer, or the like) to convey information to electronic device 100 for controlling electronic device 100 in any suitable manner, such as for controlling or otherwise interacting with a user interface presented by electronic device 100.
- the user interface presented by electronic device 100 may be a graphical or otherwise visual user interface that may be presented by a display output component of electronic device 100 .
- the user interface presented by electronic device 100 may be an at least partially non-visual user interface that may instead provide audible and/or tactile information to the user.
- Collectively, input assembly 200 and electronic device 100 may be referred to herein as a “user input” system 1 .
- system 1 may be operative to determine and/or estimate a current user orientation of at least one body part of user U (e.g., a palm, wrist, set of fingers, etc.) with respect to at least one physical feature of input assembly 200 as well as to detect a current physical manipulation of input assembly 200 by user U.
- System 1 may also be operative to use both the determined user orientation and the detected physical manipulation to define a particular control command for controlling electronic device 100 (e.g., for controlling a user interface presented by electronic device 100 ).
- System 1 may be operative to determine three or more particular user orientations of user U with respect to input assembly 200 , such as to determine that a current user orientation is a particular one of at least three possible user orientations that may be detectable by system 1 .
- system 1 may be configured to detect any greater number of possible user orientations, such as 30, 60, 90, 120, 180, 240, 270, 360, 480, 540, 720, and/or any other suitable number of possible user orientations with respect to the input assembly (e.g., any one of 360 degree orientations of user U's left hand with respect to input assembly 200 and any one of 360 degree orientations of user U's right hand with respect to input assembly 200).
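Resolving a continuously measured hand angle to one of a fixed set of evenly spaced possible user orientations (3, 30, 360, and so on, per the passage above) can be sketched as a nearest-neighbor snap. The rounding rule and function name here are assumptions for illustration only:

```python
def snap_orientation(measured_deg, num_orientations):
    """Snap a measured orientation angle (degrees) to the nearest of
    num_orientations evenly spaced possible user orientations,
    returning the snapped angle in [0, 360)."""
    step = 360.0 / num_orientations
    return (round(measured_deg / step) * step) % 360.0
```

For example, with eight possible orientations a hand detected at 47 degrees would be treated as the 45-degree user coordinate system, while with 360 possible orientations effectively every whole-degree position is its own orientation.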
- system 1 may be operative to detect any suitable number of types of physical manipulation by user U of input assembly 200 , such as any planar movement of the entirety of input assembly 200 in a plane (e.g., along a work surface), any planar movement of a portion of input assembly 200 with respect to another portion of input assembly 200 , any rotational movement of a portion of input assembly 200 with respect to another portion of input assembly 200 , any touch gestures along a surface of input assembly 200 , and/or the like.
- A shape of at least an external portion of input assembly 200 may be a dome or another suitable shape operative to present a similar external surface to user U no matter which of three or more user orientations user U may have with respect to that external surface, such that user U may more easily orient himself or herself with respect to input assembly 200 no matter his or her position relative to input assembly 200 (e.g., as compared to an input assembly shaped to provide only one or two comfortable orientations with respect to the user). Therefore, system 1 may be configured to enable user U to control electronic device 100 in a similar fashion using the same physical manipulation of input assembly 200, no matter which one of three or more user orientations user U may have with respect to input assembly 200.
- Electronic device 100 may be any portable, mobile, or hand-held electronic device configured to receive control signals from input assembly 200 for controlling a user interface of electronic device 100 .
- electronic device 100 may not be portable at all, but may instead be generally stationary.
- Electronic device 100 can include, but is not limited to, a media player, video player, still image player, game player, other media player, music recorder, movie or video camera or recorder, still camera, other media recorder, radio, medical equipment, domestic appliance, transportation vehicle instrument, musical instrument, calculator, cellular telephone (e.g., an iPhone™ available from Apple Inc.), other wireless communication device, personal digital assistant, remote control, pager, computer (e.g., a desktop, laptop, tablet, server, etc.), monitor, television, stereo equipment, set-top box, wearable device (e.g., an Apple Watch™ by Apple Inc.), boom box, modem, router, printer, and combinations thereof.
- Electronic device 100 may include any suitable control circuitry or processor 102 , memory 104 , communications component 106 , power supply 108 , input component 110 , and output component 112 .
- Electronic device 100 may also include a bus 114 that may provide one or more wired or wireless communication links or paths for transferring data and/or power to, from, or between various other components of device 100 .
- Device 100 may also be provided with a housing 101 that may at least partially enclose one or more of the components of device 100 for protection from debris and other degrading forces external to device 100 .
- one or more of the components may be provided within its own housing (e.g., output component 112 may be an independent display within its own housing that may wirelessly or through a wire communicate with processor 102 , which may be provided within its own housing).
- one or more components of electronic device 100 may be combined or omitted.
- electronic device 100 may include other components not combined or included in FIG. 1 .
- device 100 may include any other suitable components or several instances of the components shown in FIG. 1 . For the sake of simplicity, only one of each of the components is shown in FIG. 1 .
- Memory 104 may include one or more storage mediums, including for example, a hard-drive, flash memory, permanent memory such as read-only memory (“ROM”), semi-permanent memory such as random access memory (“RAM”), any other suitable type of storage component, or any combination thereof.
- Memory 104 may include cache memory, which may be one or more different types of memory used for temporarily storing data for electronic device applications.
- Memory 104 may store media data (e.g., music and image files), software (e.g., applications for implementing functions on device 100 (e.g., virtual drawing space or other user interface applications)), firmware, preference information (e.g., media playback preferences), lifestyle information (e.g., food preferences), exercise information (e.g., information obtained by exercise monitoring equipment), transaction information (e.g., information such as credit card information), wireless connection information (e.g., information that may enable device 100 to establish a wireless connection), subscription information (e.g., information that keeps track of podcasts or television shows or other media a user subscribes to), contact information (e.g., telephone numbers and e-mail addresses), calendar information, any other suitable data, or any combination thereof.
- Communications component 106 may be provided to allow device 100 to communicate with one or more other electronic devices or servers or subsystems (e.g., input assembly 200 ) using any suitable communications protocol(s).
- communications component 106 may support Wi-Fi (e.g., an 802.11 protocol), Ethernet, Bluetooth™, near field communication (“NFC”), radio-frequency identification (“RFID”), high frequency systems (e.g., 900 MHz, 2.4 GHz, and 5.6 GHz communication systems), infrared, transmission control protocol/internet protocol (“TCP/IP”) (e.g., any of the protocols used in each of the TCP/IP layers), hypertext transfer protocol (“HTTP”), BitTorrent™, file transfer protocol (“FTP”), real-time transport protocol (“RTP”), real-time streaming protocol (“RTSP”), secure shell protocol (“SSH”), any other communications protocol, or any combination thereof.
- Communications component 106 may also include circuitry that can enable device 100 to be electrically coupled to another device or server or subsystem (e.g., input assembly 200 ).
- Power supply 108 may provide power to one or more of the components of device 100 .
- power supply 108 can be coupled to a power grid (e.g., when device 100 is not a portable device, such as a desktop computer).
- power supply 108 can include one or more batteries for providing power (e.g., when device 100 is a portable device, such as a cellular telephone).
- power supply 108 can be configured to generate power from a natural source (e.g., solar power using solar cells).
- One or more input components 110 may be provided to permit a user to interact or interface with device 100 and/or to sense certain information about the ambient environment.
- input component 110 can take a variety of forms, including, but not limited to, a touch pad, trackpad, dial, click wheel, scroll wheel, touch screen, one or more buttons (e.g., a keyboard), mouse, joystick, track ball, switch, photocell, force-sensing resistor (“FSR”), encoder (e.g., a rotary encoder and/or shaft encoder that may convert an angular position or motion of a shaft or axle to an analog or digital code), microphone, camera, scanner (e.g., a barcode scanner or any other suitable scanner that may obtain product identifying information from a code, such as a linear barcode or a matrix barcode (e.g., a quick response (“QR”) code)), proximity sensor (e.g., a capacitive proximity sensor), or biometric sensor (e.g., a fingerprint reader or other feature recognition sensor), and the like.
- Electronic device 100 may also include one or more output components 112 that may present information (e.g., graphical, audible, and/or tactile information) to a user of device 100 .
- An output component of electronic device 100 may take various forms, including, but not limited to, audio speakers, headphones, data and/or power line-outs, visual displays (e.g., for transmitting data via visible light and/or via invisible light), antennas, infrared ports, flashes (e.g., light sources for providing artificial light for illuminating an environment of the device), tactile/haptic outputs (e.g., rumblers, vibrators, etc.), taptic components (e.g., components that are operative to provide tactile sensations in the form of vibrations), and any combinations thereof.
- electronic device 100 may include a display as output component 112 .
- Display 112 may include any suitable type of display or interface for presenting visual data to a user.
- display 112 may include a display embedded in device 100 or coupled to device 100 (e.g., a removable display).
- Display 112 may include, for example, a liquid crystal display (“LCD”), a light emitting diode (“LED”) display, an organic light-emitting diode (“OLED”) display, a surface-conduction electron-emitter display (“SED”), a carbon nanotube display, a nanocrystal display, an organic electroluminescence display, electronic ink, or another type of display technology or combination of display technology types.
- display 112 can include a movable display or a projecting system for providing a display of content on a surface remote from electronic device 100 , such as, for example, a video projector, a head-up display, or a three-dimensional (e.g., holographic) display.
- display 112 may include a digital or mechanical viewfinder, such as a viewfinder of the type found in compact digital cameras, reflex cameras, or any other suitable still or video camera.
- display 112 may include display driver circuitry, circuitry for driving display drivers, or both.
- Display 112 can be operative to display content (e.g., media playback information, application screens for applications implemented on electronic device 100 , information regarding ongoing communications operations, information regarding incoming communications requests, device operation screens, etc.) that may be under the direction of processor 102 .
- Display 112 can be associated with any suitable characteristic dimensions defining the size and shape of the display.
- the display can be rectangular or have any other polygonal shape, or alternatively can be defined by a curved or other non-polygonal shape (e.g., a circular display).
- Display 112 can have one or more primary orientations for which an interface can be displayed, or can instead or in addition be operative to display an interface along any orientation selected by a user.
- one or more input components 110 and one or more output components 112 may sometimes be referred to collectively herein as an input/output (“I/O”) component or I/O interface 111 (e.g., input component 110 and display 112 as I/O component or I/O interface 111 ).
- input component 110 and display 112 may sometimes be a single I/O component 111 , such as a touch screen, that may receive input information through a user's and/or stylus' touch of a display screen and that may also provide visual information to a user via that same display screen.
- Input component 110 of electronic device 100 may provide an input surface relative to which a system user may manipulate the orientation and position of stylus 400 to convey information to electronic device 100 .
- such an input surface of input component 110 of electronic device 100 may be provided as a portion of a multi-touch display screen assembly (e.g., as a portion of I/O interface 111 with a display output component 112 ).
- such an input surface of input component 110 of electronic device 100 may be a non-display input surface, such as, but not limited to, a trackpad or drawing tablet, whether or not device 100 may also include a display output component.
- Processor 102 of device 100 may include any processing circuitry operative to control the operations and performance of one or more components of electronic device 100 .
- processor 102 may be used to run one or more applications, such as an application 103 .
- Application 103 may include, but is not limited to, one or more operating system applications, firmware applications, user interface applications, media playback applications, media editing applications, pass applications, calendar applications, state determination applications (e.g., device state determination applications, input assembly state determination applications, etc.), biometric feature-processing applications, compass applications, health applications, thermometer applications, weather applications, thermal management applications, force sensing applications, device diagnostic applications, video game applications, or any other suitable applications.
- processor 102 may load application 103 as a user interface program or any other suitable program to determine how instructions or data received via an input component 110 and/or any other component of device 100 (e.g., input assembly data from input assembly 200 via communications component 106 , etc.) may manipulate the one or more ways in which information may be stored on device 100 (e.g., in memory 104 ) and/or provided to a user via an output component 112 and/or to a remote subsystem (e.g., to input assembly 200 via communications component 106 ).
- Application 103 may be accessed by processor 102 from any suitable source, such as from memory 104 (e.g., via bus 114 ) or from another device or server (e.g., from input assembly 200 via communications component 106 , and/or from any other suitable remote source via communications component 106 ).
- Electronic device 100 (e.g., processor 102 , memory 104 , or any other components available to device 100 ) may be operative to run application 103 .
- Processor 102 may include a single processor or multiple processors.
- processor 102 may include at least one “general purpose” microprocessor, a combination of general and special purpose microprocessors, instruction set processors, graphics processors, video processors, and/or related chips sets, and/or special purpose microprocessors.
- Processor 102 also may include on board memory for caching purposes.
- Processor 102 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions.
- processor 102 can be a microprocessor, a central processing unit, an application-specific integrated circuit, a field-programmable gate array, a digital signal processor, an analog circuit, a digital circuit, or a combination of such devices.
- Processor 102 may be a single-thread or multi-thread processor.
- Processor 102 may be a single-core or multi-core processor. Accordingly, as described herein, the term “processor” may refer to a hardware-implemented data processing device or circuit physically structured to execute specific transformations of data including data operations represented as code and/or instructions included in a program that can be stored within and accessed from a memory. The term is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, analog or digital circuits, or other suitably configured computing element or combination of elements.
- Input assembly 200 may be any suitable electronic user input tool, mouse, trackpad, user-manipulated electronic input device, hand-held input device, and/or the like that may be configured to provide control signals or other input to electronic device 100 .
- Input assembly 200 may include any suitable control circuitry or processor 202 , which may be similar to any suitable processor 102 of device 100 , application 203 , which may be similar to any suitable application 103 of device 100 , memory 204 , which may be similar to any suitable memory 104 of device 100 , communications component 206 , which may be similar to any suitable communications component 106 of device 100 , power supply 208 , which may be similar to any suitable power supply 108 of device 100 , input component 210 , which may be similar to any suitable input component 110 of device 100 , output component 212 , which may be similar to any suitable output component 112 of device 100 , I/O interface 211 , which may be similar to any suitable I/O interface 111 of device 100 , and bus 214 , which may be similar to any suitable bus 114 of device 100 .
- one or more components of input assembly 200 may be combined or omitted.
- input assembly 200 may include other components not combined or included in FIG. 1 .
- input assembly 200 may include any other suitable components or several instances of the components shown in FIG. 1 .
- Input assembly 200 and electronic device 100 may be operative to communicate any suitable data 99 (e.g., control data signals) between communication components 206 and 106 using any suitable communication protocol(s).
- FIGS. 1A-2F show input assembly 200 positioned on work surface 5 .
- Input assembly 200 may be provided with a dome housing structure 201 d positioned above a base housing structure 201 b , while various components of input assembly 200 (e.g., processor 202 , memory 204 , communications component 206 , one or more input components 210 , one or more output components 212 , and/or the like) may be at least partially positioned in a space between dome structure 201 d and base structure 201 b .
- Input assembly 200 may include one or more feet or pad elements extending downwardly from base structure 201 b for contacting work surface 5 and/or for supporting the remainder of input assembly 200 thereabove.
- input assembly 200 may include four feet F, R, B, and L that may be offset from one another and extending downwardly from different portions of base structure 201 b for providing balanced support for input assembly 200 on work surface 5 .
- Dome structure 201 d may be any suitable shape, such as a spherical dome (e.g., a portion of a sphere cut off by a plane), a spheroidal dome (e.g., a section of a spheroid that provides a dome that is circularly symmetric (e.g., with an axis IA of rotation)), an ellipsoidal dome (e.g., a portion of an ellipsoid cut off by a plane normal to a symmetry axis IA of the ellipsoid), and/or the like, such that the shape of exterior surface 201 s of dome structure 201 d may be symmetrical about axis IA.
- exterior surface 201 s may be a symmetrical curved dome surface about axis IA (e.g., a dome structure of any suitable diameter (e.g., 6 inches to 8 inches or any other suitable range)).
- input assembly 200 may provide a structure of a consistent shape when viewed from any line of sight perpendicular to axis IA (e.g., as shown by the similarities between FIGS. 1A and 1B that may provide the same curvature C of surface 201 s ).
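This rotational symmetry can be illustrated numerically: for a spherical dome (a cap cut from a sphere), the height of exterior surface 201 s depends only on radial distance from axis IA, so the silhouette is identical from every horizontal line of sight. A minimal sketch, using the 6-to-8-inch diameter range mentioned above and an assumed sphere radius (the function and dimensions here are illustrative, not prescribed by the specification):

```python
import math

def spherical_dome_profile(base_diameter, sphere_radius, samples=5):
    """Height of a spherical dome (cap) surface at sampled radial
    distances from the central axis (axis IA). Because the cap is a
    surface of revolution, this profile -- and hence the silhouette --
    is the same along any horizontal line of sight."""
    a = base_diameter / 2.0                      # base radius
    assert sphere_radius >= a, "cap must fit on the sphere"
    cap_height = sphere_radius - math.sqrt(sphere_radius**2 - a**2)
    profile = []
    for i in range(samples):
        r = a * i / (samples - 1)                # radial distance from axis IA
        z = math.sqrt(sphere_radius**2 - r**2) - (sphere_radius - cap_height)
        profile.append(z)
    return cap_height, profile

# e.g., a 7-inch-diameter dome cut from an assumed 5-inch-radius sphere
h, prof = spherical_dome_profile(7.0, 5.0)
```

The profile peaks at the cap height on axis IA and falls to zero at the base rim, independent of azimuth, which is why the assembly presents the same curvature C from any viewing direction perpendicular to axis IA.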
- input assembly 200 may be described as having a particular assembly orientation, as various sensor input components 210 or other components of assembly 200 may be positioned within housing 201 in a particular orientation and/or calibrated with a particular orientation. Therefore, feet F, R, B, and L along with axis IA may be used herein to provide some context to a particular input assembly orientation or coordinate system.
- such distinct feet may not exist or may not be visible to an end user (e.g., such feet may be hidden by a portion of base structure 201 b that may extend about all such feet and contact or almost contact surface 5 ), whereby there may be no features visible to an end user operative to structurally and/or visually distinguish one orientation of assembly 200 from another orientation of assembly 200 with respect to surface 5 about an axis (e.g., axis IA).
- assembly 200 may be any other suitable shape that may not be symmetrical about axis IA (e.g., surface 201 s may be rectangular and/or a cuboid or any other suitable two-dimensional or three-dimensional shape).
- FIGS. 1A-2F depict input assembly 200 positioned on work surface 5 with a particular orientation with respect to at least one portion of electronic device 100 .
- electronic device 100 may include a display output component 112 with a planar display screen 112 s with an outer surface that may exist in an X-Z plane of an X-Y-Z three-dimensional Cartesian coordinate system DC (device coordinate system DC), while work surface 5 may exist in an X-Y plane of device coordinate system DC (although any other suitable relationship between surface 5 and screen 112 s may exist).
- System 1 may be configured such that electronic device 100 may be displaying at least an input assembly controlled pointer or cursor 112 c on display screen 112 s that may be controlled by data 99 received from input assembly 200 based on user U interaction with input assembly 200 .
- any suitable features of assembly 200 may define an axis of an input coordinate system.
- feet F and B may together define an axis FBA that may be operative to define a Y-axis of an input coordinate system NC
- axis IA may define a Z-axis of input coordinate system NC
- an axis between feet L and R may define an X-axis of input coordinate system NC.
- any other suitable feature(s) and/or component(s) may be operative to define input coordinate system NC (e.g., any suitable axis/axes of any suitable sensing mechanism(s) of a sensor input component 210 of assembly 200 ).
- input assembly 200 and, thus, input coordinate system NC may be positioned in a first input orientation with respect to display screen 112 s and, thus, with respect to device coordinate system DC.
- the first input orientation of assembly 200 with respect to device 100 may include input coordinate system NC of assembly 200 aligned with device coordinate system DC (e.g., axis X of input coordinate system NC is aligned with axis X of device coordinate system DC (e.g., feet L and R may define an X-axis of input coordinate system NC and may be aligned along an X-axis of device coordinate system DC with foot R further in the +X direction), an axis Y of input coordinate system NC (e.g., axis FBA) is aligned with axis Y of device coordinate system DC (e.g., with foot F further in the +Y direction closest to display screen 112 s ), and an axis Z of input coordinate system NC (e.g., axis IA) is aligned with axis Z of device coordinate system DC).
- input assembly 200 may be positioned with any other suitable second input orientation with respect to display screen 112 s (e.g., housing 201 may be rotated 45° in the direction of arrow CW (clockwise) about axis Z of device coordinate system DC, whereby feet L and F may be equidistant from display screen 112 s ).
- the second input orientation of assembly 200 and input coordinate system NC with respect to device 100 and device coordinate system DC may include input coordinate system NC of assembly 200 offset by 45° in the direction of arrow CW (e.g., about its Z-axis (e.g., axis IA)) from device coordinate system DC (e.g., axis X of input coordinate system NC is offset with axis X of device coordinate system DC by 45° in the direction of arrow CW and axis Y of input coordinate system NC is offset from axis Y of device coordinate system DC by 45° in the direction of arrow CW).
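The 45° offset between input coordinate system NC and device coordinate system DC is a planar rotation about their shared Z-axis (axis IA). A hedged sketch of that relationship (the function name and sign convention are illustrative, not from the specification):

```python
import math

def rotate_about_z(vec, degrees_cw):
    """Rotate a 2D (x, y) vector clockwise about the Z-axis, e.g., to
    express an axis of input coordinate system NC in device coordinate
    system DC when NC is offset clockwise (arrow CW) from DC."""
    t = math.radians(-degrees_cw)   # clockwise = negative CCW angle
    x, y = vec
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t))

# NC's +Y axis (e.g., axis FBA), offset 45° CW from DC's +Y axis,
# points toward (+X, +Y) in device coordinates:
x, y = rotate_about_z((0.0, 1.0), 45.0)   # x ≈ 0.707, y ≈ 0.707
```

Because the dome is symmetric about axis IA, any such rotation leaves the exterior shape unchanged; only the axes of system NC move relative to system DC.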
- input assembly 200 may be positioned at any suitable orientation with respect to display screen 112 s or any other portion of device 100 on surface 5 (e.g., input coordinate system NC may be oriented in any suitable orientation with respect to device coordinate system DC) and input assembly 200 may still provide a structure of a substantially consistent shape (e.g., absent feet F, R, B, and L) when viewed from any line of sight perpendicular to axis IA (e.g., as shown by the similarities between FIGS. 1A, 1B, 2A, and 2B ).
- user U may create any user orientation between a body part of user U and exterior surface 201 s of dome structure 201 d (e.g., about axis IA) and still feel the same shape of dome structure 201 d .
- For example, no matter whether user U positions its hand on top of exterior surface 201 s of dome structure 201 d at a first user orientation of FIG. 1D with respect to input assembly 200 , at a second user orientation of FIG. 1E with respect to input assembly 200 , or at a third user orientation of FIG. 1F with respect to input assembly 200 , user U may feel the same or substantially similar structural shape of dome structure 201 d . This may enable user U to “blindly” reach for and interact with input assembly 200 , no matter the user orientation with input assembly 200 , and still receive the same structural feedback from input assembly 200 (e.g., input assembly 200 may feel the same to user U no matter which of many various user orientations user U's hand may have with respect to input assembly 200 ).
- the first user orientation may include user U's right hand positioned on top of exterior surface 201 s such that a right middle finger axis RMA of user U's right middle finger may be axially aligned with feet F and B of input assembly 200 and, thus, with axis Y of input coordinate system NC and, thus, with axis Y of device coordinate system DC due to the first orientation of assembly 200 and system NC with respect to device 100 and system DC.
- Axis RMA may be operative to define a Y-axis of a user coordinate system UC, which may include an X-Y plane substantially parallel to surface 5 .
- the first user orientation may include user coordinate system UC of user U's right hand aligned with device coordinate system DC (e.g., axis X of user coordinate system UC is aligned with axis X of device coordinate system DC and axis Y of user coordinate system UC is aligned with axis Y of device coordinate system DC) and with input coordinate system NC.
- the second user orientation may include user U's right hand positioned on top of exterior surface 201 s such that right middle finger axis RMA may be axially offset by 45° in the direction of arrow CW from a line extending through feet F and B of input assembly 200 and, thus, axially offset by 45° in the direction of arrow CW from axis Y of each one of device coordinate system DC and input coordinate system NC.
- the second user orientation may include user coordinate system UC of user U's right hand offset by 45° in the direction of arrow CW from input coordinate system NC and from device coordinate system DC (e.g., axis X of user coordinate system UC is offset by 45° in the direction of arrow CW from axis X of device coordinate system DC and axis Y of user coordinate system UC is offset by 45° in the direction of arrow CW from axis Y of device coordinate system DC).
- the third user orientation may include user U's right hand positioned on top of exterior surface 201 s such that right middle finger axis RMA may be axially offset by 45° in the direction of arrow CCW (counter clockwise) from a line extending through feet F and B of input assembly 200 and, thus, axially offset by 45° in the direction of arrow CCW from axis Y of each one of device coordinate system DC and input coordinate system NC.
- the third user orientation may include user coordinate system UC of user U's right hand offset by 45° in the direction of arrow CCW from input coordinate system NC and from device coordinate system DC (e.g., axis X of user coordinate system UC is offset by 45° in the direction of arrow CCW from axis X of device coordinate system DC and axis Y of user coordinate system UC is offset by 45° in the direction of arrow CCW from axis Y of device coordinate system DC).
- This may enable user U to “blindly” reach for and interact with input assembly 200 , no matter the user orientation with input assembly 200 and no matter the orientation of input assembly 200 to device 100 , and still receive the same structural feedback from input assembly 200 (e.g., input assembly 200 may feel the same to user U no matter which of many various user orientations user U's hand may have with respect to input assembly 200 ).
- the fourth user orientation may include user U's right hand positioned on top of exterior surface 201 s such that the Y-axis of user coordinate system UC (e.g., right middle finger axis RMA) may be axially offset by 45° in the direction of arrow CCW from the Y-axis of input coordinate system NC of input assembly 200 (e.g., a line extending through feet F and B) but axially aligned with axis Y of device coordinate system DC.
- the fifth user orientation may include user U's right hand positioned on top of exterior surface 201 s such that the Y-axis of user coordinate system UC (e.g., right middle finger axis RMA) may be axially aligned with the Y-axis of input coordinate system NC of input assembly 200 (e.g., a line extending through feet F and B) but axially offset by 45° in the direction of arrow CW from axis Y of device coordinate system DC.
- the sixth user orientation may include user U's right hand positioned on top of exterior surface 201 s such that the Y-axis of user coordinate system UC (e.g., right middle finger axis RMA) may be axially offset by 90° in the direction of arrow CCW from the Y-axis of input coordinate system NC of input assembly 200 but only axially offset by 45° in the direction of arrow CCW from the Y-axis of device coordinate system DC of device 100 . Yet, no matter which of these or any other similar user orientations user U's right hand may have with respect to the top of exterior surface 201 s (e.g., with respect to axis IA), exterior surface 201 s may feel the same to user U.
- a particular orientation of user coordinate system UC of user U with respect to input coordinate system NC of assembly 200 may be the same as or different than a particular orientation of user coordinate system UC of user U with respect to device coordinate system DC of device 100 .
- System 1 may be configured to determine a current user orientation of user U (e.g., of user coordinate system UC) with respect to assembly 200 (e.g., with respect to input coordinate system NC) using any suitable sensing components of system 1 .
- any suitable sensor input components 210 of assembly 200 (e.g., hover or near touch sensors or any suitable touch sensors that may be integrated into and/or under and/or about exterior surface 201 s ) may be operative to detect the orientation of one or more body parts (e.g., palm, hand, one or more fingers, etc.) of user U on top of (e.g., physically against or hovering above but not physically touching) exterior surface 201 s to determine the orientation of user coordinate system UC of user U with respect to input coordinate system NC of assembly 200 (e.g., to determine the current user orientation of right middle finger axis RMA of user U's right hand and/or of a left middle finger axis of user U's left hand).
- various sensors may be used to accurately determine a current relationship between a user's hand and a mouse (e.g., by comparing detected user features with baseline features).
- various sensors may be used to accurately determine a location of one or more fingers on a surface of a device.
- any suitable user characteristic or characteristics may be identified in order to determine the current user orientation of user U with respect to input assembly 200 (e.g., spacing and/or orientation between finger tips, palm print, and/or the like (e.g., average direction of each detected finger of a hand of a user with respect to any suitable dimension of assembly 200 ))
- identification of a middle finger (e.g., right middle finger axis RMA of user U for a right hand of a user) may enable identification of user coordinate system UC and may be referred to herein for determining a current user orientation of user U to input assembly 200 .
- Such user orientation determination may then be used to map any detected physical manipulations of input assembly 200 by user U to particular commands that may appropriately control the user interface.
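As one hypothetical concretization of such a determination (the palm-centroid approach and every name below are assumptions for illustration, not from the specification), the direction of a hand axis such as axis RMA could be estimated by averaging the sensed palm-to-fingertip directions:

```python
import math

def estimate_user_axis(palm_xy, fingertip_xys):
    """Estimate the user's hand axis (e.g., approximating right middle
    finger axis RMA) as the mean direction from a sensed palm centroid
    to each sensed fingertip, in input-assembly coordinates (system NC).
    Returns the angle of user coordinate system UC's +Y axis, measured
    counter-clockwise from NC's +X axis, in degrees."""
    px, py = palm_xy
    sx = sum(fx - px for fx, fy in fingertip_xys)
    sy = sum(fy - py for fx, fy in fingertip_xys)
    return math.degrees(math.atan2(sy, sx))

# Hypothetical sensor readings: palm at the origin, three fingertips
# fanned roughly toward +Y of the assembly's coordinate system NC.
angle = estimate_user_axis((0.0, 0.0),
                           [(-1.0, 3.0), (0.0, 3.2), (1.0, 3.0)])
```

The returned angle fixes the orientation of system UC relative to system NC, which is the quantity needed to map subsequent physical manipulations consistently.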
- system 1 may be operative to determine axis RMA and, thus, to determine user coordinate system UC that may use axis RMA as its Y-axis. Then, system 1 may be operative to detect any user physical manipulations of input assembly 200 by the user with respect to determined user coordinate system UC, and then to consistently map such physical manipulations within user coordinate system UC to user interface manipulations within device coordinate system DC.
- Determination of a current user orientation with respect to input assembly 200 may then be used to map particular user physical manipulations of input assembly 200 to particular types of control data 99 that may be communicated to device 100 for controlling device 100 (e.g., for controlling cursor 112 c on screen 112 s ).
- user coordinate system UC may be determined by input assembly 200 for a current user position or a current user orientation of user U with respect to input assembly 200 (e.g., current orientation of user coordinate system UC with respect to input coordinate system NC)
- a physical manipulation of input assembly 200 as initiated by user U from that determined current user position or orientation may be detected with respect to the determined user coordinate system UC of that current user position and then that detected physical manipulation with respect to user coordinate system UC may be mapped to user interface manipulations within device coordinate system DC for generating the appropriate control data 99 , no matter what the orientation is of system UC with respect to system NC.
- Such a process may yield consistent control data for the same physical manipulation of assembly 200 by a user with respect to a determined user coordinate system UC, no matter what the relationship between that user coordinate system UC and input coordinate system NC, and/or no matter what the relationship between that user coordinate system UC and device coordinate system DC, and/or no matter what the relationship between input coordinate system NC and device coordinate system DC.
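A minimal sketch of such a mapping, assuming the user orientation has already been reduced to a single angle between user coordinate system UC and input coordinate system NC (the function and its sign convention are illustrative assumptions):

```python
import math

def map_motion_to_user_frame(dx_nc, dy_nc, user_angle_ccw_deg):
    """Re-express a motion delta sensed in input coordinate system NC
    as a delta in user coordinate system UC, where user_angle_ccw_deg
    is UC's counter-clockwise offset from NC. Rotating by the negative
    of that offset undoes the user's orientation, so the same physical
    gesture produces the same control data no matter how the user's
    hand (or the dome) happens to be oriented."""
    t = math.radians(-user_angle_ccw_deg)
    dx = dx_nc * math.cos(t) - dy_nc * math.sin(t)
    dy = dx_nc * math.sin(t) + dy_nc * math.cos(t)
    return dx, dy

# A swipe "away from the user" sensed as (-0.707, +0.707) in NC, with
# the user's frame UC offset 45° CCW from NC, maps to a pure +Y motion
# in the user's frame:
dx, dy = map_motion_to_user_frame(-0.7071, 0.7071, 45.0)
```

The resulting UC-frame delta can then be translated to device coordinate system DC for control data 99 (e.g., to move cursor 112 c ), independent of how system NC sits relative to system DC.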
- FIG. 3 is a flowchart of an illustrative process 300 for using an input assembly for controlling an electronic device, such as for using input assembly 200 for controlling electronic device 100 .
- the input assembly may determine whether a user interface event has been detected. If a user interface event is not detected at operation 302 , then operation 302 may be repeated until a user interface event is detected or until any suitable interrupt of process 300 may be received. However, if a user interface event is detected at operation 302 , then process 300 may advance to operation 304 , where the input assembly may define a current user coordinate system based on the user interface event detected at operation 302 .
- any suitable type of interaction by a user with an input assembly may be detected as a user interface event at operation 302 , including, but not limited to, detection of at least a portion of a user's hand in physical contact with any suitable surface (e.g., surface 201 s ) of the input assembly (e.g., instantaneous detection or detection of substantially consistent contact for a threshold period of time (e.g., 300 milliseconds, 500 milliseconds, etc.), etc.), detection of at least a portion of a user's hand hovering adjacent to (e.g., within a threshold distance of) but not contacting the input assembly (e.g., instantaneous detection or detection of substantially consistent hovering for a threshold period of time (e.g., 300 milliseconds, 500 milliseconds, etc.), etc.), and/or the like.
- a user interface event may be detected (e.g., at operation 302 ) and system 1 may attempt to detect a current user coordinate system (e.g., system UC of FIG. 1D (e.g., at operation 304 )).
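The contact-persistence thresholds mentioned above (e.g., 300 milliseconds of substantially consistent contact or hovering) can be realized with a simple debouncer. A sketch under assumed timing values (nothing here is prescribed by the specification):

```python
import time

class ContactDebouncer:
    """Report a user interface event only after substantially
    consistent contact (or hovering) persists for a threshold period
    (e.g., 300 ms), per the detection options described above."""
    def __init__(self, threshold_s=0.3):
        self.threshold_s = threshold_s
        self._since = None          # time contact began, or None

    def update(self, in_contact, now=None):
        """Feed one sensor sample; returns True once contact has been
        sustained for at least the threshold period."""
        now = time.monotonic() if now is None else now
        if not in_contact:
            self._since = None      # contact lost: start over
            return False
        if self._since is None:
            self._since = now
        return (now - self._since) >= self.threshold_s

d = ContactDebouncer(threshold_s=0.3)
d.update(True, now=0.0)            # contact begins: no event yet
event = d.update(True, now=0.35)   # sustained past 300 ms: event fires
```

The same structure, with the condition inverted (no contact, or no movement, sustained for a threshold period), would serve for the reset interface event of operation 306.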
- the input assembly may define a current user coordinate system, such as user coordinate system UC, based on the user interface event detected at operation 302 .
- detection of at least a user's middle finger in contact with surface 201 s may be a user interface event detected at operation 302 , and then an appropriate user coordinate system UC may be defined by the input assembly at operation 304 based on that detected position of the user's middle finger.
- a relationship between the current input coordinate system NC of input assembly 200 at operation 304 and the current orientation of the user U with respect to that system NC as may be used to define user coordinate system UC of user U at operation 304 may be any suitable relationship (e.g., aligned, offset about axis IA by 45°, offset about axis IA by 90°, etc.).
- the relationship between current input coordinate system NC of input assembly 200 at operation 304 and the current user coordinate system UC at operation 304 may be utilized internally by input assembly 200 to track movement of the user with respect to input assembly 200 .
- process 300 may advance to operation 306 , where the input assembly may determine whether a reset interface event has been detected. If a reset interface event is detected at operation 306 , then process 300 may return to operation 302 until a user interface event is detected or until any suitable interrupt of process 300 may be received. For example, if a reset interface event is detected, process 300 may return to operation 302 or operation 304 to potentially define a new current user coordinate system. However, if a reset interface event is not detected at operation 306 , then process 300 may advance to operation 308 , where the input assembly may determine whether any physical use of the input assembly has been detected.
- any suitable type of interaction by a user with an input assembly may be detected as a reset interface event at operation 306 , including, but not limited to, detection of at least a portion or the entirety of a user's hand terminating physical contact with any suitable surface (e.g., surface 201 s ) of the input assembly (e.g., instantaneous detection of no contact or detection of consistent lack of contact for a threshold period of time (e.g., 300 milliseconds, 500 milliseconds, etc.), etc.), detection of no movement of a user's hand or of at least a portion of a user's hand for at least a threshold period of time (e.g., no detected movement of a user's hand along surface 201 s that is contacting surface 201 s ), and/or the like.
- a reset event may be detected (e.g., at operation 306 ) and system 1 may attempt to detect a new user interface event (e.g., at operation 302 ).
- if a user's relationship with system 1 changed from that of FIG. 1D (e.g., at which a first current user coordinate system UC may be defined as shown) to that of FIG. 1E and then the user did not move from the position of FIG. 1E for at least a threshold period of time,
- a reset event may be detected (e.g., at operation 306 ) and system 1 may attempt to detect a new user interface event (e.g., at operation 302 ), which may be immediately detected due to the user's current interaction with assembly 200 , such that a second current user coordinate system UC may then be defined to be the system UC of FIG. 1E (e.g., at operation 304 ).
- process 300 may advance to operation 308 , where the input assembly may determine whether any physical use of the input assembly has been detected. If no physical use of the input assembly is detected at operation 308 , then process 300 may return to operation 306 to determine if any reset interface event may be detected. However, if any physical use of the input assembly is detected at operation 308 , then process 300 may advance to operation 310 , where appropriate control data may be determined based on the physical use detected at operation 308 and based on the current user coordinate system defined at operation 304 , where such control data may be used to control the operation of any suitable electronic device (e.g., device 100 ), where such control data may be generated by the input assembly and/or by the electronic device.
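The flow through operations 302-310 described above can be sketched as a control loop. The `sensors` and `rules` objects below are assumed helper interfaces, not elements of the disclosure: `sensors` reports interface events, reset events, and physical use, and `rules` maps a detected use, interpreted in the current user coordinate system, to control data:

```python
def run_input_process(sensors, rules):
    """Illustrative control loop mirroring operations 302-310 of process 300."""
    while True:
        # Operation 302: wait for a user interface event.
        event = sensors.wait_for_interface_event()
        # Operation 304: define the current user coordinate system.
        user_cs = sensors.define_user_coordinate_system(event)
        while True:
            # Operation 306: a reset event discards the current system.
            if sensors.reset_event_detected():
                break  # back to operation 302
            # Operation 308: look for physical use of the assembly.
            use = sensors.detect_physical_use()
            if use is None:
                continue  # back to operation 306
            # Operation 310: map the use, interpreted in the current user
            # coordinate system, to control data for the device.
            yield rules.control_data(use, user_cs)
```

Each yielded item stands in for control data that the input assembly and/or the electronic device would generate at operation 310.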
- Any suitable physical use may be detected at operation 308 , such as physical movement of input assembly 200 with respect to (“w/r/t”) or along work surface 5 (e.g., any movement of assembly 200 along any path along surface 5 or any rotation of the entirety of assembly 200 about axis IA), movement of a portion of user U (e.g., a finger tip or multiple finger tips) along surface 201 s of assembly 200 , tapping or force pressing by a portion of user U (e.g., a finger tip or multiple finger tips) downward into surface 201 s of assembly 200 , movement of a portion of input assembly 200 (e.g., rotation of dome structure 201 d with respect to base structure 201 b , axial shear force of dome structure 201 d with respect to base structure 201 b , etc.), and/or the like.
- The operations shown in process 300 of FIG. 3 are only illustrative, and existing operations may be modified or omitted, additional operations may be added, and the order of certain operations may be altered.
- Application 203 of input assembly 200 and/or application 103 of electronic device 100 may be developed to include a rule system, such as a rule system that may be at least partially represented by media rule system Table 1 provided below, that may include various rules, where each rule may be associated with a particular control action and with a particular type of physical use defined with respect to a particular current user coordinate system.
- each one of rules R1-R12 may be associated with or be defined to include a particular type of physical use (e.g., as may be detected at operation 308 ) defined with respect to a current user coordinate system UC (e.g., as may be defined at operation 304 ) and a particular control action that may be used to define particular control data (e.g., control data to be determined at operation 310 ), where such control data may be used by device 100 (e.g., by processor 102 ) to carry out the particular control action defined by the control data.
- system 1 may be configured to map the X-axis of user coordinate system UC to the X-axis of device coordinate system DC and the Y-axis of user coordinate system UC to the Z-axis of device coordinate system DC for detected physical use that involves movement of input assembly 200 along work surface 5 .
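Table 1 itself is not reproduced in this excerpt, so the sketch below uses a hypothetical rule table (the dictionary layout, the action names, and the specific entries are assumptions) to illustrate the kind of (physical use, UC direction) → device action mapping described here, including the UC X-axis → DC X-axis and UC Y-axis → DC Z-axis mapping for sliding the assembly along work surface 5:

```python
# Hypothetical rule table in the spirit of Table 1 (the actual rules R1-R12
# are not reproduced in this excerpt). Keys are (physical use, direction in
# the current user coordinate system UC); values are device-space actions.
RULES = {
    ("slide", "+X"): ("move_cursor", "+X"),  # cf. rule R1: UC X -> DC X
    ("slide", "-X"): ("move_cursor", "-X"),
    ("slide", "+Y"): ("move_cursor", "+Z"),  # UC Y maps to DC Z
    ("slide", "-Y"): ("move_cursor", "-Z"),  # cf. rule R4
}

def control_action(use: str, uc_direction: str):
    """Map a physical use, expressed in user coordinates, to device control data."""
    return RULES.get((use, uc_direction))
```

For instance, a slide in the U-Y direction would look up `("slide", "-Y")` and produce a cursor move in the -Z direction of device coordinate system DC.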
- For example, a user physical manipulation of input assembly 200 in the U+X direction (e.g., as determined at operation 308 ) of a determined current user coordinate system UC (e.g., as determined at operation 304 ) may be mapped by rule R1 (e.g., at operation 310 ) to a user interface manipulation of device 100 in the +X direction of device coordinate system DC (e.g., when user U physically slides input assembly 200 along surface 5 in the U+X direction of a determined current user coordinate system UC, control data may be determined that may be operative to instruct device 100 to correspondingly move cursor 112 c along screen 112 s in the +X direction of device coordinate system DC).
- As another example, a user physical manipulation of input assembly 200 in the U-Y direction (e.g., as determined at operation 308 ) of a determined current user coordinate system UC (e.g., as determined at operation 304 ) may be mapped by Rule R4 (e.g., at operation 310 ) to a user interface manipulation of device 100 in the −Z direction of device coordinate system DC (e.g., when user U physically slides input assembly 200 along surface 5 in the U-Y direction of a determined current user coordinate system UC, control data may be determined that may be operative to instruct device 100 to correspondingly move cursor 112 c along screen 112 s in the −Z direction of device coordinate system DC).
- one or more sensors or applications or processors or otherwise of assembly 200 may be operative to be recalibrated or mapped based on the current determined user coordinate system UC to detect user physical manipulation within that user coordinate system UC.
- This may enable different types of movement of assembly 200 with respect to input coordinate system NC to result in the same control data when those different types of movement are the result of the same type of movement with respect to a current user coordinate system UC.
- Any suitable sensor input component(s) 210 may be used to sense user physical manipulation of input assembly 200 along surface 5 , such as any suitable optical sensor(s), a track ball, or the like.
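The recalibration described above amounts to a change of basis: a motion delta sensed in the assembly's input coordinate system NC is rotated into the current user coordinate system UC before any rule is applied. A minimal sketch, assuming (as a simplification not stated in the disclosure) that UC's orientation relative to NC is represented as a single angle about axis IA:

```python
import math

def nc_to_uc(dx_nc: float, dy_nc: float, uc_angle_deg: float):
    """Rotate a motion delta sensed in input coordinates NC into the
    current user coordinate system UC.

    `uc_angle_deg` is the angle of UC's X-axis measured in NC (an assumed
    representation of the current determined user coordinate system).
    """
    theta = math.radians(uc_angle_deg)
    # Inverse rotation: express the NC-frame vector in the UC frame.
    dx_uc = dx_nc * math.cos(theta) + dy_nc * math.sin(theta)
    dy_uc = -dx_nc * math.sin(theta) + dy_nc * math.cos(theta)
    return dx_uc, dy_uc
```

With this transform, two users seated at different orientations who each slide the assembly in their own U+X direction produce the same UC-frame delta, and therefore the same control data, even though the NC-frame deltas differ.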
- Alternatively, a user physical manipulation of input assembly 200 may be to physically move a body part of user U with respect to assembly 200 (e.g., along exterior surface 201 s ).
- For example, the current user coordinate system may have been defined (e.g., at operation 304 ) to be user coordinate system UC of FIG. 1D when user U physically rotates its right hand along exterior surface 201 s of assembly 200 in the direction of arrow CW by 45° about axis IA from the orientation of FIG. 1D to the orientation of FIG. 1E , or the current user coordinate system may have been defined to be that of FIG. 2D or that of FIG. 2F when user U makes the same 45° CW hand rotation. Each one of these three exemplary rotations of user U by 45° CW with respect to surface 201 s of assembly 200 may result in the same control data being determined by process 300 (e.g., control data that may be operative to rotate an object selected by cursor 112 c by 45° CW), despite each one of the three exemplary rotations having a differently defined current user coordinate system UC (e.g., UC of FIG. 1D , UC of FIG. 2D , and UC of FIG. 2F ).
- system UC of FIG. 1D and system UC of FIG. 2D have the same relationship with system DC (e.g., X-axes aligned), but system UC of FIG. 1D has a different relationship with system NC (e.g., X- and Y-axes aligned) than system UC of FIG. 2D has with system NC (e.g., X- and Y-axes not aligned).
- Additionally, system UC of FIG. 1D has a different relationship with system DC (e.g., X-axes aligned) than system UC of FIG. 2F has with system DC (e.g., X- and Y-axes not aligned), and system UC of FIG. 1D has a different relationship with system NC (e.g., X- and Y-axes aligned) than system UC of FIG. 2F has with system NC (e.g., Y-axis of UC aligned with X-axis of NC), and system NC of FIG. 1D has a different relationship with system DC (e.g., X- and Y-axes aligned) than system NC of FIG. 2F has with system DC (e.g., X- and Y-axes not aligned).
- As one particular example, user U may physically rotate its right hand along exterior surface 201 s in the direction of arrow CW by 45° about axis IA from the orientation of FIG. 1D to the orientation of FIG. 1E , where the current user orientation of axis RMA of user U may rotate with respect to assembly 200 (e.g., axis RMA may rotate from being aligned with feet F and B at FIG. 1D to being offset between feet F and R of FIG. 1E (e.g., system UC may rotate with respect to system NC in the direction of arrow CW by 45°)), and system 1 (e.g., one or more motion sensor input components 210 or touch sensor (e.g., multi-touch) sensor input components 210 of assembly 200 ) may be operative to detect the movement of user U along exterior surface 201 s with respect to the initial current determined user coordinate system UC of FIG. 1D to detect the 45° CW rotation and may be operative to generate particular command data 99 for controlling an interface of device 100 in a particular manner (e.g., rotate an object selected by cursor 112 c by 45° CW) (e.g., the control action of Rule R5).
- As another particular example, user U may physically rotate its right hand along exterior surface 201 s in the direction of arrow CW by 45° about axis IA from the orientation of FIG. 2F to the orientation of FIG. 2D , where the current user orientation of axis RMA of user U may rotate with respect to assembly 200 (e.g., axis RMA may rotate from being aligned with feet L and R at FIG. 2F to being offset between feet L and F of FIG. 2D ), and system 1 (e.g., one or more motion sensor input components 210 or touch sensor (e.g., multi-touch) sensor input components 210 of assembly 200 ) may be operative to detect the movement of user U along exterior surface 201 s with respect to the initial current determined user coordinate system UC of FIG. 2F to detect the 45° CW rotation and may be operative to generate particular command data 99 for controlling an interface of device 100 in a particular manner (e.g., rotate an object selected by cursor 112 c by 45° CW) (e.g., the control action of Rule R5), which may be the same as the particular manner based on user physical manipulation between FIGS. 1D and 1E , as the same rotation of user U with respect to exterior surface 201 s occurred despite the orientation of user U to device 100 not being the same between the two physical manipulations.
- In each case, the initial current determined user coordinate system UC (e.g., of FIG. 1D or of FIG. 2F ) may be used for providing context to the entire physical manipulation (e.g., rotation) of user U with respect to exterior surface 201 s (e.g., rather than updating the determined user coordinate system UC during the manipulation). Therefore, in some embodiments, once an initial current determined user coordinate system UC may be determined by system 1 , that same determined user coordinate system UC may be used for providing context to any detected user physical manipulation, as long as one or more rules are followed (e.g., as long as user U does not completely break contact with assembly 200 during the manipulation or otherwise interact with assembly 200 in a manner that may cause system 1 to attempt to reset the current determined user coordinate system UC and determine a new user coordinate system UC (e.g., provide no movement for more than a particular threshold of time)).
- As another example, a user physical manipulation of input assembly 200 may be to physically move a body part of user U with respect to assembly 200 (e.g., along exterior surface 201 s ) in a different manner (e.g., just a portion of a finger rather than all fingers and palm).
- For example, the current user coordinate system may have been defined (e.g., at operation 304 ) to be user coordinate system UC of FIG. 1D when user U physically flicks its middle finger along exterior surface 201 s of assembly 200 in the +Y direction (e.g., in the U+Y direction of system UC of FIG. 1D ), or the current user coordinate system may have been defined to be that of FIG. 2D or that of FIG. 2F when user U makes the same +Y-direction flick. Each one of these three exemplary flicks of a finger of user U along surface 201 s of assembly 200 in the +Y direction of the current user coordinate system UC may result in the same control data being determined by process 300 (e.g., control data that may be operative to make cursor 112 c bigger (e.g., control data based on the action of Rule R7)), despite each one of the three exemplary finger flicks having a differently defined current user coordinate system UC (e.g., UC of FIG. 1D , UC of FIG. 2D , and UC of FIG. 2F ).
- system UC of FIG. 1D and system UC of FIG. 2D have the same relationship with system DC (e.g., X-axes aligned), but system UC of FIG. 1D has a different relationship with system NC (e.g., X- and Y-axes aligned) than system UC of FIG. 2D has with system NC (e.g., X- and Y-axes not aligned).
- system UC of FIG. 1D has a different relationship with system DC (e.g., X-axes aligned) than system UC of FIG. 2F has with system DC (e.g., X- and Y-axes not aligned), and system UC of FIG. 1D has a different relationship with system NC (e.g., X- and Y-axes aligned) than system UC of FIG. 2F has with system NC (e.g., Y-axis of UC aligned with X-axis of NC), and system NC of FIG. 1D has a different relationship with system DC (e.g., X- and Y-axes aligned) than system NC of FIG. 2F has with system DC (e.g., X- and Y-axes not aligned).
- a user physical manipulation of input assembly 200 may be to physically rotate housing 201 of assembly 200 about axis IA on surface 5 .
- For example, user U may physically rotate assembly 200 in the direction of arrow CW by 45° about axis IA on surface 5 from the orientation of FIG. 1D to the orientation of FIG. 2E , where the current user orientation of axis RMA of user U may remain the same with respect to assembly 200 (e.g., axis RMA may remain aligned with feet F and B between the orientation of FIG. 1D and the orientation of FIG. 2E (e.g., the orientation of system UC to system NC may remain constant during rotation of assembly 200 by user U from FIG. 1D to FIG. 2E )), yet system 1 (e.g., one or more motion sensor input components 210 of assembly 200 ) may be operative to detect the rotation of assembly 200 and axis RMA of current determined user coordinate system UC about axis IA, and such detected 45° CW rotation may be operative to generate particular command data 99 for controlling an interface of device 100 in a particular manner (e.g., control data based on the action of Rule R9 (e.g., make cursor 112 c brighter)).
- As another example, user U may physically rotate assembly 200 in the direction of arrow CW by 45° about axis IA on surface 5 from the orientation of FIG. 1F , and system 1 (e.g., one or more motion sensor input components 210 of assembly 200 ) may be operative to detect the rotation of assembly 200 and axis RMA of current determined user coordinate system UC about axis IA, and such detected 45° CW rotation may be operative to generate particular command data 99 for controlling an interface of device 100 in a particular manner (e.g., control data based on the action of Rule R9 (e.g., make cursor 112 c brighter)), which may be the same as the particular manner based on user physical manipulation between FIGS. 1D and 2E .
- a user physical manipulation of input assembly 200 may be to physically move dome structure 201 d with respect to base structure 201 b .
- a user physical manipulation of assembly 200 may move dome structure 201 d in the U+X direction of a current determined user coordinate system UC with respect to base structure 201 b from the position of FIG. 1G to the position of FIG. 1H by a distance D.
- Any suitable sensor input component(s) 210 of assembly 200 (e.g., any suitable shear force sensor(s) 210 s (e.g., with haptic feedback)) may be configured to detect such user physical manipulation in the context of current determined user coordinate system UC, and such user physical manipulation in the U+X direction of a current determined user coordinate system UC may be mapped to a user interface manipulation of device 100 in the +X direction of device coordinate system DC (e.g., when user U physically moves structure 201 d with respect to structure 201 b by distance D in the U+X direction of a determined current user coordinate system UC, system 1 (e.g., assembly 200 and/or device 100 ) may be configured to generate and communicate control data 99 to device 100 that may be operative to instruct device 100 to correspondingly bounce (or otherwise manipulate) cursor 112 c along screen 112 s in the +X direction of device coordinate system DC by a distance proportional to distance D (e.g., based on the action of Rule R11)).
- Movement of structure 201 d with respect to structure 201 b may be enabled in any suitable directions (e.g., 2, 4, 8, 16, 32, or more directions within an X-Y plane of a user coordinate system UC), such that assembly 200 may be manipulated like an analog joystick controller.
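Quantizing the dome-versus-base displacement into a fixed number of joystick-like directions, as described above, can be sketched as follows; the sector-index encoding is an illustrative assumption, not the disclosure's representation:

```python
import math

def quantize_direction(dx_uc: float, dy_uc: float, n_directions: int) -> int:
    """Snap a dome-versus-base displacement (already expressed in the user
    coordinate system UC) to the nearest of `n_directions` evenly spaced
    directions; returns the sector index, with sector 0 centered on U+X.
    """
    angle = math.atan2(dy_uc, dx_uc) % (2 * math.pi)
    sector = 2 * math.pi / n_directions
    # Offset by half a sector so each sector is centered on its direction.
    return int((angle + sector / 2) // sector) % n_directions
```

Because the displacement is expressed in UC first, the same push "away from the user" lands in the same sector regardless of how the user's hand is oriented about axis IA.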
- Any suitable user physical interactions with respect to (e.g., any suitable physical use of) assembly 200 may be detected by system 1 (e.g., at operation 308 ) for controlling an interface of device 100 according to the concepts of this disclosure (e.g., for mapping (e.g., at operation 310 ) a user physical manipulation as detected (e.g., at operation 308 ) with respect to a determined user coordinate system UC (e.g., as defined at operation 304 ) to a particular interface manipulation with respect to device coordinate system DC).
- For example, any suitable multi-touch sensor input component(s) 210 may be provided along dome exterior surface 201 s to detect any suitable touch gestures by user U along exterior surface 201 s (e.g., pinch to zoom between a thumb and index finger, full hand rotation (as mentioned above), scroll wheel by a single finger flicking motion (e.g., using a physical encoder or otherwise), scroll wheel by a single finger circular path motion (e.g., a circular dome shaped surface 201 s may be more conducive to facilitating a circular finger motion than a flat surface), single or multi-finger clicks (e.g., each finger may be tapped on surface 201 s and detected as that particular finger such that system 1 may be operative to associate different finger clicks with different user control commands for device 100 ), two or three or four finger gestures (e.g., clicks or relative movement on surface 201 s ), and/or the like).
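As one illustration of the single-finger circular-path scroll gesture mentioned above, the change in the fingertip's angle about axis IA between samples can serve as a scroll delta. This sketch assumes (as a simplification not stated in the disclosure) that touch positions are available as (x, y) coordinates centered on axis IA:

```python
import math

def scroll_delta(prev_xy, cur_xy):
    """Angular change (radians, CW positive) of a fingertip about axis IA
    between two touch samples; an assumed encoding for the circular-scroll
    gesture on dome surface 201 s.
    """
    a0 = math.atan2(prev_xy[1], prev_xy[0])
    a1 = math.atan2(cur_xy[1], cur_xy[0])
    d = a0 - a1  # CW rotation decreases the atan2 angle
    # Wrap into [-pi, pi) so crossing the +/-pi seam doesn't jump.
    return (d + math.pi) % (2 * math.pi) - math.pi
```

Accumulating these deltas over a gesture gives a continuous scroll value, much like reading a physical encoder.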
- Various touch sensor technologies may be used with a curved exterior surface 201 s , such as capacitive touch sensor technologies (e.g., carbon nanobud, metal wire, metal mesh, conductive fabric, flexible circuitry (e.g., polyethylene terephthalate (“PET”), polyethylene naphthalate (“PEN”), polyimide (“PI”), etc.)), optical touch sensor technologies (e.g., frustrated total internal reflection (“FTIR”) multi-touch technology, etc.), ultrasonic touch sensor technologies, and/or the like.
- Any suitable touch sensing may also be enabled to detect force (e.g., a magnitude of pressure or force exerted by a user at each touch event), such as in vertical, horizontal, and/or rotational axes with respect to surface 201 s .
- any suitable optical sensor and/or inertial measurement unit (“IMU”) sensor input component(s) 210 of assembly 200 may be operative to detect physical rotation of dome structure 201 d with respect to base structure 201 b (e.g., about axis IA), which may be operative to enable assembly 200 to be physically manipulated for use as a scroll wheel (e.g., physical encoder or otherwise).
- Any suitable haptic and/or audible and/or visual feedback may be provided by any suitable output component(s) 212 of assembly 200 to help user U confidently interact with system 1 .
- such an assembly 200 with dome housing structure 201 d may provide not only an ambidextrous design that may be similarly used by either the left or right hand of a user, but also an orientationless design (e.g., about axis IA) that may be similarly used by any hand at any user orientation with respect to any component(s) of assembly 200 (e.g., at any orientation of system UC of any hand with respect to system NC (e.g., any one of 3 or more (e.g., 360) such orientations)), while providing consistent and expected device control of device 100 .
- System 1 may be operative to detect a current user orientation (e.g., out of three or more possible orientations (e.g., 360 orientations for 360° rotation of user U's hand about axis IA)) with respect to assembly 200 (e.g., by sensing the position of one or more body parts of a user (e.g., the position of one or more types of digits relative to one another) using heat sensing, touch sensing, comparisons to known user orientations, etc.) for defining a user coordinate system UC with respect to input coordinate system NC in which one or more user physical manipulations of assembly 200 by user U may then be detected, such that a particular user manipulation within user coordinate system UC may consistently control device 100 in the same manner despite that user coordinate system UC being able to have multiple orientations with respect to input coordinate system NC of assembly 200 and/or with respect to device coordinate system DC of device 100 .
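One hypothetical way to realize the orientation sensing described above is to estimate the user's angle about axis IA from the fingertip contact points of the gripping hand. The centroid heuristic below is an illustrative assumption (fingertips of a gripping hand tend to cluster on the far side of the dome from the user), not the disclosure's method:

```python
import math

def estimate_user_angle(fingertips):
    """Estimate the user's orientation about axis IA from fingertip contact
    points (x, y) on the dome, expressed in input coordinates NC.

    Hypothetical heuristic: the centroid direction of the contacts
    approximates the user's U+Y ('away from the user') axis, which then
    fixes user coordinate system UC relative to NC.
    """
    cx = sum(p[0] for p in fingertips) / len(fingertips)
    cy = sum(p[1] for p in fingertips) / len(fingertips)
    return math.degrees(math.atan2(cy, cx))
```

The returned angle could seed something like the `nc_to_uc` rotation above, so that subsequent manipulations are interpreted in the freshly defined UC.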
- assembly 200 may have any suitable size and/or shape, such as a flat rectangle, and may not necessarily be domed and/or orientationless with respect to one or more axes.
- Once a user coordinate system UC has been defined for such an assembly, any particular physical manipulation of assembly 200 with respect to that defined user coordinate system UC may result in the same user control of device 100 (e.g., the same manipulation of cursor 112 c ).
- any aspects of the disclosure may each be implemented by software, but may also be implemented in hardware, firmware, or any combination of software, hardware, and firmware. They each may also be embodied as computer-readable code recorded on a computer-readable medium.
- the computer-readable medium may be any data storage device that can store data or instructions which can thereafter be read by a computer system. Examples of the computer-readable medium may include, but are not limited to, read-only memory, random-access memory, flash memory, CD-ROMs, DVDs, magnetic tape, and optical data storage devices (e.g., memory 104 and/or memory 204 of FIG. 1 ).
- the computer-readable medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
- the computer-readable medium may be communicated from one electronic device to another electronic device using any suitable communications protocol (e.g., the computer-readable medium may be communicated to electronic device 100 via communications component 106 and/or assembly 200 via communications component 206 ).
- the computer-readable medium may embody computer-readable code, instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media.
- a modulated data signal may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- each process may be enabled by any suitable software construct, firmware construct, one or more hardware components, or a combination thereof.
- each process may be described in the general context of computer-executable instructions, such as program modules, that may be executed by one or more computers or other devices.
- a program module may include one or more routines, programs, objects, components, and/or data structures that may perform one or more particular tasks or that may implement one or more particular abstract data types.
- the number, configuration, functionality, and interconnection of program modules of system 1 may be of any suitable architecture.
Description
- This application claims the benefit of prior filed U.S. Provisional Patent Application No. 62/398,939, filed Sep. 23, 2016, which is hereby incorporated by reference herein in its entirety.
- This generally relates to an input assembly for controlling an electronic device and, more particularly, to a domed orientationless and/or ambidextrous input assembly for controlling an electronic device.
- Some systems may include an electronic device with a display assembly operative to present a graphical user interface, as well as an electronic input assembly that may be manipulated by a user for generating user control signals operative to adjust the graphical user interface. However, existing systems often limit the ways by which a user may interact with an input assembly to generate particular user control signals.
- Domed input assemblies for controlling an electronic device and methods for using domed input assemblies for controlling an electronic device are provided.
- As an example, an input assembly is provided for controlling an electronic device, where the input assembly may include a housing structure providing an orientationless surface with respect to at least one axis of an input coordinate system of the input assembly, a sensor subassembly at least partially protected by the housing structure, and a processor operative to detect, with the sensor subassembly, a user coordinate system of a user with respect to the orientationless surface, detect, with the sensor subassembly, a physical use of the housing structure, and determine a control action for the electronic device based on the user coordinate system and the physical use.
- As another example, a method is provided for controlling an electronic device using an input assembly of a user, where the method may include detecting a user interface event that includes the user interfacing with the input assembly, after the detecting the user interface event, defining a user coordinate system for the detected user interface event from a plurality of possible user coordinate systems, after the defining the user coordinate system, detecting a physical use of the input assembly, and, based on both the defined user coordinate system and the detected physical use, determining control data that is operative to control the electronic device.
- As yet another example, an input assembly is provided for controlling an electronic device, where the input assembly may include a housing structure providing a surface within an input coordinate system of the input assembly, a sensor subassembly at least partially protected by the housing structure, and a processor operative to detect, with the sensor subassembly, a user coordinate system of a user with respect to the surface from at least three possible user coordinate systems, wherein each one of the at least three possible user coordinate systems has a different orientation with respect to the input coordinate system of the input assembly, detect, with the sensor subassembly, a physical use of the housing structure, and determine a control action for the electronic device based on the user coordinate system and the physical use.
- This Summary is provided only to summarize some example embodiments, so as to provide a basic understanding of some aspects of the subject matter described in this document. Accordingly, it will be appreciated that the features described in this Summary are only examples and should not be construed to narrow the scope or spirit of the subject matter described herein in any way. Unless otherwise stated, features described in the context of one example may be combined or used with features described in the context of one or more other examples. Other features, aspects, and advantages of the subject matter described herein will become apparent from the following Detailed Description, Figures, and Claims.
- The discussion below makes reference to the following drawings, in which like reference characters refer to like parts throughout, and in which:
-
FIG. 1 is a schematic view of an illustrative user input system including an electronic device and an electronic input assembly; -
FIG. 1A is a left side view of an exemplary input assembly interacting in a first assembly orientation with an exemplary electronic device of the system ofFIG. 1 ; -
FIG. 1B is a front side view of the system ofFIGS. 1 and 1A with the input assembly interacting in the first assembly orientation with the electronic device; -
FIG. 1C is a top view of the system of FIGS. 1-1B with the input assembly interacting in the first assembly orientation with the electronic device;
FIG. 1D is a top view of the system of FIGS. 1-1C with the input assembly interacting in the first assembly orientation with the electronic device and with a user interacting in a first user orientation with the input assembly;
FIG. 1E is a top view of the system of FIGS. 1-1D with the input assembly interacting in the first assembly orientation with the electronic device and with the user interacting in a second user orientation with the input assembly;
FIG. 1F is a top view of the system of FIGS. 1-1E with the input assembly interacting in the first assembly orientation with the electronic device and with the user interacting in a third user orientation with the input assembly;
FIG. 1G is a front side view of the system of FIGS. 1-1F taken from line 1G-1G of FIG. 1D but without the user shown;
FIG. 1H is a front side view of the system of FIGS. 1-1G when the input assembly has been physically manipulated by the user;
FIG. 2A is a left side view of the system of FIGS. 1-1H with the input assembly interacting in a second assembly orientation with the electronic device;
FIG. 2B is a front side view of the system of FIGS. 1-2A with the input assembly interacting in the second assembly orientation with the electronic device;
FIG. 2C is a top view of the system of FIGS. 1-2B with the input assembly interacting in the second assembly orientation with the electronic device;
FIG. 2D is a top view of the system of FIGS. 1-2C with the input assembly interacting in the second assembly orientation with the electronic device and with the user interacting in a fourth user orientation with the input assembly;
FIG. 2E is a top view of the system of FIGS. 1-2D with the input assembly interacting in the second assembly orientation with the electronic device and with the user interacting in a fifth user orientation with the input assembly;
FIG. 2F is a top view of the system of FIGS. 1-2E with the input assembly interacting in the second assembly orientation with the electronic device and with the user interacting in a sixth user orientation with the input assembly; and
FIG. 3 is a flowchart of an illustrative process for using an input assembly to control an electronic device.

In the following detailed description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of the various embodiments described herein. Those of ordinary skill in the art will realize that these various embodiments are illustrative only and are not intended to be limiting in any way. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure.
In addition, for clarity purposes, not all of the routine features of the embodiments described herein are shown or described. One of ordinary skill in the art will readily appreciate that in the development of any such actual embodiment, numerous embodiment-specific decisions may be required to achieve specific design objectives. These design objectives will vary from one embodiment to another and from one developer to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine engineering undertaking for those of ordinary skill in the art having the benefit of this disclosure.
The present disclosure relates to input assemblies (e.g., domed orientationless and/or ambidextrous input assemblies) for controlling an electronic device and methods for using input assemblies for controlling an electronic device. Determination of a current user orientation with respect to an input assembly (e.g., determination of a current orientation of a user coordinate system of a user with respect to an input coordinate system of an input assembly) may be used to map particular user physical manipulations of the input assembly to particular types of control data for controlling an electronic device (e.g., for controlling a cursor on a screen of the electronic device). This may enable consistent control data to be generated in response to a particular user physical gesture imparted by a user on an input assembly no matter the orientation of the user to the input assembly and/or the orientation of the input assembly to the electronic device and/or the orientation of the user to the electronic device.
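The orientation-compensated mapping described above can be sketched as a planar rotation: once the current user orientation is known as an angle, each raw gesture vector can be rotated back into a canonical frame before being emitted as control data. This is a minimal illustrative sketch; the function name, the signed-counterclockwise angle convention, and the assumption that the orientation is available as a single angle are not taken from the disclosure.

```python
import math

def map_gesture_to_control(dx, dy, user_angle_deg):
    """Map a raw gesture vector (dx, dy), sensed in the input assembly's
    coordinate system, to control data in a canonical frame.

    user_angle_deg is the detected orientation of the user's coordinate
    system relative to the input coordinate system, signed counterclockwise
    in degrees (a hypothetical convention chosen for this sketch).
    """
    # Undo the user's orientation so the same physical gesture yields
    # the same control data no matter how the user approaches the dome.
    theta = math.radians(-user_angle_deg)
    cx = dx * math.cos(theta) - dy * math.sin(theta)
    cy = dx * math.sin(theta) + dy * math.cos(theta)
    return cx, cy
```

For example, a "forward" swipe made by a user seated 45° clockwise from the assembly's front would be sensed as a diagonal vector, but after compensation it maps to the same control data as a forward swipe made from directly in front.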
- Domed input assemblies for controlling an electronic device and methods for using domed input assemblies for controlling an electronic device are provided and described with reference to
FIGS. 1-3.
FIG. 1 is a schematic view of an illustrative system 1 with an electronic device 100 and an electronic input assembly 200. Input assembly 200 (e.g., a mouse, a trackpad, a user-manipulated electronic input device, a hand-held input device, and the like) may be configured to provide input to electronic device 100 (e.g., a tablet computer, laptop computer, desktop computer, and the like), such as for controlling a user interface presented by the electronic device. A system user (e.g., user U of FIGS. 1D-1F) may slide, rotate, squeeze, press, touch, or otherwise physically manipulate input assembly 200 in any suitable manner relative to electronic device 100 to convey information to electronic device 100 for controlling electronic device 100 in any suitable manner, such as for controlling or otherwise interacting with a user interface presented by electronic device 100. In many embodiments, the user interface presented by electronic device 100 may be a graphical or otherwise visual user interface that may be presented by a display output component of electronic device 100. However, in other embodiments, the user interface presented by electronic device 100 may be an at least partially non-visual user interface that may instead provide audible and/or tactile information to the user. Collectively, input assembly 200 and electronic device 100 may be referred to herein as a "user input" system 1.

Broadly and generally,
system 1 may be operative to determine and/or estimate a current user orientation of at least one body part of user U (e.g., a palm, wrist, set of fingers, etc.) with respect to at least one physical feature of input assembly 200, as well as to detect a current physical manipulation of input assembly 200 by user U. System 1 may also be operative to use both the determined user orientation and the detected physical manipulation to define a particular control command for controlling electronic device 100 (e.g., for controlling a user interface presented by electronic device 100). System 1 may be operative to determine three or more particular user orientations of user U with respect to input assembly 200, such as to determine that a current user orientation is a particular one of at least three possible user orientations that may be detectable by system 1. In some embodiments, system 1 may be configured to detect any greater number of possible user orientations, such as 30, 60, 90, 120, 180, 240, 270, 360, 480, 540, 720, and/or any other suitable number of possible user orientations with respect to the input assembly (e.g., any degree orientation of 360 degree orientations of user U's left hand with respect to input assembly 200 and any degree orientation of 360 degree orientations of user U's right hand with respect to input assembly 200). Moreover, system 1 may be operative to detect any suitable number of types of physical manipulation by user U of input assembly 200, such as any planar movement of the entirety of input assembly 200 in a plane (e.g., along a work surface), any planar movement of a portion of input assembly 200 with respect to another portion of input assembly 200, any rotational movement of a portion of input assembly 200 with respect to another portion of input assembly 200, any touch gestures along a surface of input assembly 200, and/or the like.
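One way the "three or more detectable user orientations" above could be realized is by quantizing a sensed hand angle into discrete bins. The sketch below assumes the angle has already been estimated by the assembly's sensors; the function name, bin count, and angle convention are illustrative assumptions, since the disclosure requires only that three or more orientations be distinguishable.

```python
def classify_user_orientation(hand_angle_deg, num_orientations=360):
    """Quantize a sensed hand angle (degrees, measured from the input
    assembly's +Y axis) into one of num_orientations discrete user
    orientations via nearest-bin rounding, wrapping 360 back to 0."""
    bin_width = 360.0 / num_orientations
    return int(round((hand_angle_deg % 360.0) / bin_width)) % num_orientations
```

With num_orientations=3 this distinguishes only three coarse approach directions; with the default of 360 it resolves any whole-degree orientation of the user's hand, matching the finest granularity contemplated above.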
A shape of at least an external portion of input assembly 200 may be a dome or other suitable shape that may be operative to present a similar external surface to user U no matter which of three or more user orientations user U may have with respect to that external surface, such that user U may more easily orient itself with respect to input assembly 200 no matter the position of user U with respect to input assembly 200 (e.g., as compared to an input assembly shaped to provide only one or two comfortable orientations with respect to the user). Therefore, system 1 may be configured to enable user U to control electronic device 100 in a similar fashion using the same physical manipulation of input assembly 200 no matter which one of three or more user orientations user U may have with respect to input assembly 200.
Electronic device 100 may be any portable, mobile, or hand-held electronic device configured to receive control signals from input assembly 200 for controlling a user interface of electronic device 100. Alternatively, electronic device 100 may not be portable at all, but may instead be generally stationary. Electronic device 100 can include, but is not limited to, a media player, video player, still image player, game player, other media player, music recorder, movie or video camera or recorder, still camera, other media recorder, radio, medical equipment, domestic appliance, transportation vehicle instrument, musical instrument, calculator, cellular telephone (e.g., an iPhone™ available from Apple Inc.), other wireless communication device, personal digital assistant, remote control, pager, computer (e.g., a desktop, laptop, tablet, server, etc.), monitor, television, stereo equipment, set-top box, wearable device (e.g., an Apple Watch™ by Apple Inc.), boom box, modem, router, printer, and combinations thereof. Electronic device 100 may include any suitable control circuitry or processor 102, memory 104, communications component 106, power supply 108, input component 110, and output component 112. Electronic device 100 may also include a bus 114 that may provide one or more wired or wireless communication links or paths for transferring data and/or power to, from, or between various other components of device 100. Device 100 may also be provided with a housing 101 that may at least partially enclose one or more of the components of device 100 for protection from debris and other degrading forces external to device 100. In some embodiments, one or more of the components may be provided within its own housing (e.g., output component 112 may be an independent display within its own housing that may communicate with processor 102, which may be provided within its own housing, either wirelessly or through a wire).
In some embodiments, one or more components of electronic device 100 may be combined or omitted. Moreover, electronic device 100 may include other components not combined or included in FIG. 1. For example, device 100 may include any other suitable components or several instances of the components shown in FIG. 1. For the sake of simplicity, only one of each of the components is shown in FIG. 1.
Memory 104 may include one or more storage mediums, including, for example, a hard-drive, flash memory, permanent memory such as read-only memory (“ROM”), semi-permanent memory such as random access memory (“RAM”), any other suitable type of storage component, or any combination thereof. Memory 104 may include cache memory, which may be one or more different types of memory used for temporarily storing data for electronic device applications. Memory 104 may store media data (e.g., music and image files), software (e.g., applications for implementing functions on device 100 (e.g., virtual drawing space or other user interface applications)), firmware, preference information (e.g., media playback preferences), lifestyle information (e.g., food preferences), exercise information (e.g., information obtained by exercise monitoring equipment), transaction information (e.g., information such as credit card information), wireless connection information (e.g., information that may enable device 100 to establish a wireless connection), subscription information (e.g., information that keeps track of podcasts or television shows or other media a user subscribes to), contact information (e.g., telephone numbers and e-mail addresses), calendar information, any other suitable data, or any combination thereof.

Communications component 106 may be provided to allow
device 100 to communicate with one or more other electronic devices or servers or subsystems (e.g., input assembly 200) using any suitable communications protocol(s). For example, communications component 106 may support Wi-Fi (e.g., an 802.11 protocol), Ethernet, Bluetooth™, near field communication (“NFC”), radio-frequency identification (“RFID”), high frequency systems (e.g., 900 MHz, 2.4 GHz, and 5.6 GHz communication systems), infrared, transmission control protocol/internet protocol (“TCP/IP”) (e.g., any of the protocols used in each of the TCP/IP layers), hypertext transfer protocol (“HTTP”), BitTorrent™, file transfer protocol (“FTP”), real-time transport protocol (“RTP”), real-time streaming protocol (“RTSP”), secure shell protocol (“SSH”), any other communications protocol, or any combination thereof. Communications component 106 may also include circuitry that can enable device 100 to be electrically coupled to another device or server or subsystem (e.g., input assembly 200) and communicate with that other device, either wirelessly or via a wired connection.
Power supply 108 may provide power to one or more of the components of device 100. In some embodiments, power supply 108 can be coupled to a power grid (e.g., when device 100 is not a portable device, such as a desktop computer). In some embodiments, power supply 108 can include one or more batteries for providing power (e.g., when device 100 is a portable device, such as a cellular telephone). As another example, power supply 108 can be configured to generate power from a natural source (e.g., solar power using solar cells).

One or
more input components 110 may be provided to permit a user to interact or interface with device 100 and/or to sense certain information about the ambient environment. For example, input component 110 can take a variety of forms, including, but not limited to, a touch pad, trackpad, dial, click wheel, scroll wheel, touch screen, one or more buttons (e.g., a keyboard), mouse, joy stick, track ball, switch, photocell, force-sensing resistor (“FSR”), encoder (e.g., rotary encoder and/or shaft encoder that may convert an angular position or motion of a shaft or axle to an analog or digital code), microphone, camera, scanner (e.g., a barcode scanner or any other suitable scanner that may obtain product identifying information from a code, such as a linear barcode, a matrix barcode (e.g., a quick response (“QR”) code), or the like), proximity sensor (e.g., capacitive proximity sensor), biometric sensor (e.g., a fingerprint reader or other feature recognition sensor, which may operate in conjunction with a feature-processing application that may be accessible to electronic device 100 for authenticating or otherwise identifying or detecting a user), line-in connector for data and/or power, force sensor (e.g., any suitable capacitive sensors, pressure sensors, strain gauges, sensing plates (e.g., capacitive and/or strain sensing plates), etc.), temperature sensor (e.g., thermistor, thermocouple, thermometer, silicon bandgap temperature sensor, bimetal sensor, etc.) for detecting the temperature of a portion of electronic device 100 or an ambient environment thereof, a performance analyzer for detecting an application characteristic related to the current operation of one or more components of electronic device 100 (e.g., processor 102), motion sensor (e.g., single axis or multi axis accelerometers, angular rate or inertial sensors (e.g., optical gyroscopes, vibrating gyroscopes, gas rate gyroscopes, or ring gyroscopes), linear velocity sensors, and/or the like), magnetometer (e.g., scalar or vector magnetometer), pressure sensor, light sensor (e.g., ambient light sensor (“ALS”), infrared (“IR”) sensor, etc.), touch sensor, hover (e.g., finger hover or near touch) sensor (e.g., one or more ultrasonic transducers or receivers and/or far field capacitive sensing and/or the like), thermal sensor, acoustic sensor, sonic or sonar sensor, radar sensor, image sensor, video sensor, global positioning system (“GPS”) detector, radio frequency (“RF”) detector, RF or acoustic Doppler detector, RF triangulation detector, electrical charge sensor, peripheral device detector, event counter, and any combinations thereof. Each input component 110 can be configured to provide one or more dedicated control functions for making selections or issuing commands associated with operating device 100.
Electronic device 100 may also include one or more output components 112 that may present information (e.g., graphical, audible, and/or tactile information) to a user of device 100. An output component of electronic device 100 may take various forms, including, but not limited to, audio speakers, headphones, data and/or power line-outs, visual displays (e.g., for transmitting data via visible light and/or via invisible light), antennas, infrared ports, flashes (e.g., light sources for providing artificial light for illuminating an environment of the device), tactile/haptic outputs (e.g., rumblers, vibrators, etc.), taptic components (e.g., components that are operative to provide tactile sensations in the form of vibrations), and any combinations thereof.

For example,
electronic device 100 may include a display as output component 112. Display 112 may include any suitable type of display or interface for presenting visual data to a user. In some embodiments, display 112 may include a display embedded in device 100 or coupled to device 100 (e.g., a removable display). Display 112 may include, for example, a liquid crystal display (“LCD”), a light emitting diode (“LED”) display, an organic light-emitting diode (“OLED”) display, a surface-conduction electron-emitter display (“SED”), a carbon nanotube display, a nanocrystal display, an organic electroluminescence display, electronic ink, or another type of display technology or combination of display technology types. Alternatively, display 112 can include a movable display or a projecting system for providing a display of content on a surface remote from electronic device 100, such as, for example, a video projector, a head-up display, or a three-dimensional (e.g., holographic) display. As another example, display 112 may include a digital or mechanical viewfinder, such as a viewfinder of the type found in compact digital cameras, reflex cameras, or any other suitable still or video camera. In some embodiments, display 112 may include display driver circuitry, circuitry for driving display drivers, or both. Display 112 can be operative to display content (e.g., media playback information, application screens for applications implemented on electronic device 100, information regarding ongoing communications operations, information regarding incoming communications requests, device operation screens, etc.) that may be under the direction of processor 102. Display 112 can be associated with any suitable characteristic dimensions defining the size and shape of the display. For example, the display can be rectangular or have any other polygonal shape, or alternatively can be defined by a curved or other non-polygonal shape (e.g., a circular display). Display 112 can have one or more primary orientations for which an interface can be displayed, or can instead or in addition be operative to display an interface along any orientation selected by a user.

It should be noted that one or
more input components 110 and one or more output components 112 may sometimes be referred to collectively herein as an input/output (“I/O”) component or I/O interface 111 (e.g., input component 110 and display 112 as I/O component or I/O interface 111). For example, input component 110 and display 112 may sometimes be a single I/O component 111, such as a touch screen, that may receive input information through a user's and/or stylus' touch of a display screen and that may also provide visual information to a user via that same display screen. Input component 110 of electronic device 100 may provide an input surface relative to which a system user may manipulate the orientation and position of input assembly 200 to convey information to electronic device 100. In many embodiments, such an input surface of input component 110 of electronic device 100 may be provided as a portion of a multi-touch display screen assembly (e.g., as a portion of I/O interface 111 with a display output component 112). However, in other embodiments, such an input surface of input component 110 of electronic device 100 may be a non-display input surface, such as, but not limited to, a trackpad or drawing tablet, whether or not device 100 may also include a display output component.
Processor 102 of device 100 may include any processing circuitry operative to control the operations and performance of one or more components of electronic device 100. For example, processor 102 may be used to run one or more applications, such as an application 103. Application 103 may include, but is not limited to, one or more operating system applications, firmware applications, user interface applications, media playback applications, media editing applications, pass applications, calendar applications, state determination applications (e.g., device state determination applications, input assembly state determination applications, etc.), biometric feature-processing applications, compass applications, health applications, thermometer applications, weather applications, thermal management applications, force sensing applications, device diagnostic applications, video game applications, or any other suitable applications. For example, processor 102 may load application 103 as a user interface program or any other suitable program to determine how instructions or data received via an input component 110 and/or any other component of device 100 (e.g., input assembly data from input assembly 200 via communications component 106, etc.) may manipulate the one or more ways in which information may be stored on device 100 (e.g., in memory 104) and/or provided to a user via an output component 112 and/or to a remote subsystem (e.g., to input assembly 200 via communications component 106). Application 103 may be accessed by processor 102 from any suitable source, such as from memory 104 (e.g., via bus 114) or from another device or server (e.g., from input assembly 200 via communications component 106, and/or from any other suitable remote source via communications component 106).
Electronic device 100 (e.g., processor 102, memory 104, or any other components available to device 100) may be configured to process graphical data at various resolutions, frequencies, intensities, and various other characteristics as may be appropriate for the capabilities and resources of device 100. Processor 102 may include a single processor or multiple processors. For example, processor 102 may include at least one “general purpose” microprocessor, a combination of general and special purpose microprocessors, instruction set processors, graphics processors, video processors, and/or related chips sets, and/or special purpose microprocessors. Processor 102 also may include on board memory for caching purposes. Processor 102 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, processor 102 can be a microprocessor, a central processing unit, an application-specific integrated circuit, a field-programmable gate array, a digital signal processor, an analog circuit, a digital circuit, or a combination of such devices. Processor 102 may be a single-thread or multi-thread processor. Processor 102 may be a single-core or multi-core processor. Accordingly, as described herein, the term “processor” may refer to a hardware-implemented data processing device or circuit physically structured to execute specific transformations of data including data operations represented as code and/or instructions included in a program that can be stored within and accessed from a memory. The term is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, analog or digital circuits, or other suitably configured computing element or combination of elements.
Input assembly 200 may be any suitable electronic user input tool, mouse, trackpad, user-manipulated electronic input device, hand-held input device, and/or the like that may be configured to provide control signals or other input to electronic device 100. Input assembly 200 may include any suitable control circuitry or processor 202, which may be similar to any suitable processor 102 of device 100, application 203, which may be similar to any suitable application 103 of device 100, memory 204, which may be similar to any suitable memory 104 of device 100, communications component 206, which may be similar to any suitable communications component 106 of device 100, power supply 208, which may be similar to any suitable power supply 108 of device 100, input component 210, which may be similar to any suitable input component 110 of device 100, output component 212, which may be similar to any suitable output component 112 of device 100, I/O interface 211, which may be similar to any suitable I/O interface 111 of device 100, bus 214, which may be similar to any suitable bus 114 of device 100, and/or housing 201, which may be similar to any suitable housing 101 of device 100. In some embodiments, one or more components of input assembly 200 may be combined or omitted. Moreover, input assembly 200 may include other components not combined or included in FIG. 1. For example, input assembly 200 may include any other suitable components or several instances of the components shown in FIG. 1. For the sake of simplicity, only one of each of the components is shown in FIG. 1. Input assembly 200 and electronic device 100 may be operative to communicate any suitable data 99 (e.g., control data signals) between communication components 206 and 106 using any suitable communication protocol(s).
FIGS. 1A-2F show input assembly 200 positioned on work surface 5. Input assembly 200 may be provided with a dome housing structure 201 d positioned above a base housing structure 201 b, while various components of input assembly 200 (e.g., processor 202, memory 204, communications component 206, one or more input components 210, one or more output components 212, and/or the like) may be at least partially positioned in a space between dome structure 201 d and base structure 201 b. Input assembly 200 may include one or more feet or pad elements extending downwardly from base structure 201 b for contacting work surface 5 and/or for supporting the remainder of input assembly 200 thereabove. For example, as shown, input assembly 200 may include four feet F, R, B, and L that may be offset from one another and extend downwardly from different portions of base structure 201 b for providing balanced support for input assembly 200 on work surface 5.
Dome structure 201 d may be any suitable shape, such as a spherical dome (e.g., a portion of a sphere cut off by a plane), a spheroidal dome (e.g., a section of a spheroid that provides a dome that is circularly symmetric (e.g., with an axis IA of rotation)), an ellipsoidal dome (e.g., a portion of an ellipsoid cut off by a plane normal to a symmetry axis IA of the ellipsoid), and/or the like, such that the shape of exterior surface 201 s of dome structure 201 d may be symmetrical about axis IA. For example, as shown in FIGS. 1A-1C, exterior surface 201 s may be a symmetrical curved dome surface about axis IA (e.g., a dome structure of any suitable diameter (e.g., 6 inches to 8 inches or any other suitable range)). When no wires or structural markings are provided on housing 201 (e.g., on dome structure 201 d and base structure 201 b), input assembly 200 may provide a structure of a consistent shape when viewed from any line of sight perpendicular to axis IA (e.g., as shown by the similarities between FIGS. 1A and 1B that may provide the same curvature C of surface 201 s). However, input assembly 200 may be described as having a particular assembly orientation, as various sensor input components 210 or other components of assembly 200 may be positioned within housing 201 in a particular orientation and/or calibrated with a particular orientation. Therefore, feet F, R, B, and L along with axis IA may be used herein to provide some context to a particular input assembly orientation or coordinate system.
However, it is to be understood that, in some embodiments, such distinct feet may not exist or may not be visible to an end user (e.g., such feet may be hidden by a portion of base structure 201 b that may extend about all such feet and contact or almost contact surface 5), whereby there may be no features visible to an end user operative to structurally and/or visually distinguish one orientation of assembly 200 from another orientation of assembly 200 with respect to surface 5 about an axis (e.g., axis IA). Alternatively, assembly 200 may be any other suitable shape that may not be symmetrical about axis IA (e.g., surface 201 s may be rectangular and/or a cuboid or any other suitable two-dimensional or three-dimensional shape).
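The rotational symmetry of a spherical dome structure can be illustrated numerically: the height of the exterior surface depends only on the planar distance from symmetry axis IA, never on the azimuth, which is why the assembly presents the same silhouette and feel from every direction of approach. The function name and dimensions below are illustrative, not taken from the disclosure.

```python
import math

def dome_height(sphere_radius, base_radius, r):
    """Height of a spherical dome's exterior surface at planar distance r
    from symmetry axis IA (all lengths in the same unit; values are
    illustrative only)."""
    if not 0 <= r <= base_radius <= sphere_radius:
        raise ValueError("expected 0 <= r <= base_radius <= sphere_radius")
    # The sphere's center sits this far below the dome's base plane.
    center_drop = math.sqrt(sphere_radius**2 - base_radius**2)
    # The result depends only on r, not on azimuth: the dome looks and
    # feels the same from any line of sight perpendicular to axis IA.
    return math.sqrt(sphere_radius**2 - r**2) - center_drop
```

For a dome cut from a 5-unit sphere with a 4-unit base radius, the apex sits 2 units above the base and the surface falls smoothly to 0 at the rim, identically along every radial direction.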
FIGS. 1A-2F depict input assembly 200 positioned on work surface 5 with a particular orientation with respect to at least one portion of electronic device 100. As shown, electronic device 100 may include a display output component 112 with a planar display screen 112 s with an outer surface that may exist in an X-Z plane of an X-Y-Z three-dimensional Cartesian coordinate system DC (device coordinate system DC), while work surface 5 may exist in an X-Y plane of device coordinate system DC (although any other suitable relationship between surface 5 and screen 112 s may exist). System 1 may be configured such that electronic device 100 may be displaying at least an input assembly controlled pointer or cursor 112 c on display screen 112 s that may be controlled by data 99 received from input assembly 200 based on user U interaction with input assembly 200.

Any suitable features of assembly 200 (e.g., feet F and B) may define an axis of an input coordinate system. For example, as shown, feet F and B may together define an axis FBA that may be operative to define a Y-axis of an input coordinate system NC, while axis IA may define a Z-axis of input coordinate system NC and an axis between feet L and R may define an X-axis of input coordinate system NC. However, it is to be understood that any other suitable feature(s) and/or component(s) may be operative to define input coordinate system NC (e.g., any suitable axis/axes of any suitable sensing mechanism(s) of a
sensor input component 210 of assembly 200). As shown in FIGS. 1A-1H, input assembly 200 and, thus, input coordinate system NC may be positioned in a first input orientation with respect to display screen 112 s and, thus, with respect to device coordinate system DC. As shown in FIGS. 1A-1H, the first input orientation of assembly 200 with respect to device 100 may include input coordinate system NC of assembly 200 aligned with device coordinate system DC (e.g., axis X of input coordinate system NC is aligned with axis X of device coordinate system DC (e.g., feet L and R may define an X-axis of input coordinate system NC and may be aligned along an X-axis of device coordinate system DC with foot R further in the +X direction), an axis Y of input coordinate system NC (e.g., axis FBA) is aligned with axis Y of device coordinate system DC (e.g., with foot F further in the +Y direction closest to display screen 112 s), and an axis Z of input coordinate system NC (e.g., axis IA) is aligned with a Z-axis of device coordinate system DC). Alternatively, as shown in FIGS. 2A-2F, input assembly 200 may be positioned with any other suitable second input orientation with respect to display screen 112 s (e.g., housing 201 may be rotated 45° in the direction of arrow CW (clockwise) about axis Z of device coordinate system DC, whereby feet L and F may be equidistant from display screen 112 s). Therefore, as shown in FIGS.
2A-2F, the second input orientation of assembly 200 and input coordinate system NC with respect to device 100 and device coordinate system DC may include input coordinate system NC of assembly 200 offset by 45° in the direction of arrow CW (e.g., about its Z-axis (e.g., axis IA)) from device coordinate system DC (e.g., axis X of input coordinate system NC is offset from axis X of device coordinate system DC by 45° in the direction of arrow CW and axis Y of input coordinate system NC is offset from axis Y of device coordinate system DC by 45° in the direction of arrow CW). However, it is to be understood that input assembly 200 may be positioned at any suitable orientation with respect to display screen 112 s or any other portion of device 100 on surface 5 (e.g., input coordinate system NC may be oriented in any suitable orientation with respect to device coordinate system DC) and input assembly 200 may still provide a structure of a substantially consistent shape (e.g., absent feet F, R, B, and L) when viewed from any line of sight perpendicular to axis IA (e.g., as shown by the similarities between FIGS. 1A, 1B, 2A, and 2B). Therefore, no matter the orientation of input assembly 200 on surface 5 with respect to device 100 (or surface 5 or device coordinate system DC), user U may create any user orientation between a body part of user U and exterior surface 201 s of dome structure 201 d (e.g., about axis IA) and still feel the same shape of dome structure 201 d.

For example, no matter whether user U positions its hand on top of
exterior surface 201 s ofdome structure 201 d at a first user orientation ofFIG. 1D with respect toinput assembly 200, at a second user orientation ofFIG. 1E with respect toinput assembly 200, or at a third user orientation ofFIG. 1F with respect toinput assembly 200, user U may feel the same or substantially similar structural shape ofdome structure 201 d. This may enable user U to “blindly” reach for and interact withinput assembly 200, no matter the user orientation withinput assembly 200, and still receive the same structural feedback from input assembly 200 (e.g.,input assembly 200 may feel the same to user U no matter which of many various user orientations user U's hand may have with respect to input assembly 200). For example, as shown inFIG. 1D , the first user orientation may include user U's right hand positioned on top ofexterior surface 201 s such that a right middle finger axis RMA of user U's right middle finger may be axially aligned with feet F and B ofinput assembly 200 and, thus, with axis Y of input coordinate system NC and, thus, with axis Y of device coordinate system DC due to the first orientation ofassembly 200 and system NC with respect todevice 100 and system DC. Axis RMA may be operative to define a Y-axis of a user coordinate system UC, which may include an X-Y plane substantially parallel tosurface 5. Therefore, inFIG. 1D , the first user orientation may include user coordinate system UC of user U's right hand aligned with device coordinate system DC (e.g., axis X of user coordinate system UC is aligned with axis X of device coordinate system DC and axis Y of user coordinate system UC is aligned with axis Y of device coordinate system DC) and with input coordinate system NC. As shown inFIG. 
1E, the second user orientation may include user U's right hand positioned on top of exterior surface 201s such that right middle finger axis RMA may be axially offset by 45° in the direction of arrow CW from a line extending through feet F and B of input assembly 200 and, thus, axially offset by 45° in the direction of arrow CW from axis Y of each one of device coordinate system DC and input coordinate system NC. Therefore, in FIG. 1E, the second user orientation may include user coordinate system UC of user U's right hand offset by 45° in the direction of arrow CW from input coordinate system NC and from device coordinate system DC (e.g., axis X of user coordinate system UC is offset by 45° in the direction of arrow CW from axis X of device coordinate system DC and axis Y of user coordinate system UC is offset by 45° in the direction of arrow CW from axis Y of device coordinate system DC). Meanwhile, as shown in FIG. 1F, the third user orientation may include user U's right hand positioned on top of exterior surface 201s such that right middle finger axis RMA may be axially offset by 45° in the direction of arrow CCW (counter-clockwise) from a line extending through feet F and B of input assembly 200 and, thus, axially offset by 45° in the direction of arrow CCW from axis Y of each one of device coordinate system DC and input coordinate system NC. Therefore, in FIG. 1F, the third user orientation may include user coordinate system UC of user U's right hand offset by 45° in the direction of arrow CCW from input coordinate system NC and from device coordinate system DC (e.g., axis X of user coordinate system UC is offset by 45° in the direction of arrow CCW from axis X of device coordinate system DC and axis Y of user coordinate system UC is offset by 45° in the direction of arrow CCW from axis Y of device coordinate system DC).
Yet, no matter which of these or any other similar user orientations user U's right hand may have with respect to the top ofexterior surface 201 s (e.g., with respect to axis IA),exterior surface 201 s may feel the same to user U. - As another example, no matter whether user U positions its hand on top of
exterior surface 201s of dome structure 201d at a fourth user orientation of FIG. 2D with respect to input assembly 200, at a fifth user orientation of FIG. 2E with respect to input assembly 200, or at a sixth user orientation of FIG. 2F with respect to input assembly 200, user U may feel the same or substantially similar structural shape of dome structure 201d. This may enable user U to "blindly" reach for and interact with input assembly 200, no matter the user orientation with input assembly 200 and no matter the orientation of input assembly 200 to device 100, and still receive the same structural feedback from input assembly 200 (e.g., input assembly 200 may feel the same to user U no matter which of many various user orientations user U's hand may have with respect to input assembly 200). For example, as shown in FIG. 2D, the fourth user orientation may include user U's right hand positioned on top of exterior surface 201s such that the Y-axis of user coordinate system UC (e.g., right middle finger axis RMA) may be axially offset by 45° in the direction of arrow CCW from the Y-axis of input coordinate system NC of input assembly 200 (e.g., a line extending through feet F and B) but axially aligned with axis Y of device coordinate system DC. As shown in FIG. 2E, the fifth user orientation may include user U's right hand positioned on top of exterior surface 201s such that the Y-axis of user coordinate system UC (e.g., right middle finger axis RMA) may be axially aligned with the Y-axis of input coordinate system NC of input assembly 200 (e.g., a line extending through feet F and B) but axially offset by 45° in the direction of arrow CW from axis Y of device coordinate system DC, while, as shown in FIG.
2F , the sixth user orientation may include user U's right hand positioned on top ofexterior surface 201 s such that the Y-axis of user coordinate system UC (e.g., right middle finger axis RMA) may be axially offset by 90° in the direction of arrow CCW from the Y-axis of input coordinate system NC ofinput assembly 200 but only axially offset by 45° in the direction of arrow CCW from the Y-axis of device coordinate system DC ofdevice 100. Yet, no matter which of these or any other similar user orientations user U's right hand may have with respect to the top ofexterior surface 201 s (e.g., with respect to axis IA),exterior surface 201 s may feel the same to user U. Therefore, in some embodiments, depending on a current orientation of input coordinate system NC ofassembly 200 with respect to device coordinate system DC ofdevice 100, a particular orientation of user coordinate system UC of user U with respect to input coordinate system NC ofassembly 200 may be the same as or different than a particular orientation of user coordinate system UC of user U with respect to device coordinate system DC ofdevice 100. -
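The pairwise orientations described above (UC relative to NC, NC relative to DC, and UC relative to DC) compose by simple addition of signed rotation angles about the shared vertical axis (axis IA / axis Z). A minimal sketch under that assumption; the function and parameter names are hypothetical, not identifiers from the disclosure:

```python
def uc_to_dc_offset(uc_to_nc_deg, nc_to_dc_deg):
    """Compose the user-to-assembly offset (UC relative to NC) with the
    assembly-to-device offset (NC relative to DC) into the user-to-device
    offset (UC relative to DC). Angles are signed CCW degrees about the
    vertical axis, normalized to [0, 360)."""
    return (uc_to_nc_deg + nc_to_dc_deg) % 360.0


# Sixth user orientation of FIG. 2F: UC offset 90° CCW from NC, while NC
# is rotated 45° CW (i.e., -45° CCW) from DC, yielding 45° CCW UC-to-DC.
print(uc_to_dc_offset(90.0, -45.0))
```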
System 1 may be configured to determine a current user orientation of user U (e.g., of user coordinate system UC) with respect to assembly 200 (e.g., with respect to input coordinate system NC) using any suitable sensing components ofsystem 1. For example, any suitablesenor input components 210 of assembly 200 (e.g., hover or near touch sensors or any suitable touch sensors that may be integrated into and/or under and/or aboutexterior surface 201 s) may be operative to detect the orientation of one or more body parts (e.g., palm, hand, one or more fingers, etc.) of user U on top of (e.g., physically against or hovering above but not physically touching)exterior surface 201 s to determine the orientation of user coordinate system UC of user U with respect to input coordinate system NC of assembly 200 (e.g., to determine the current user orientation of right middle finger axis RMA of user U's right hand and/or of a left middle finger axis of user U's left hand and/or of any other suitable feature or combination of features of any suitable interaction appendage of the user and/or of any suitable instrument used by the user to interact withsurface 201 s, with respect to axis FBA of assembly 200). For example, as described in commonly-assigned, co-pending U.S. Non-Provisional Patent Application Publication No. 2008/0297478, which is hereby incorporated by reference herein in its entirety, various sensors may be used to accurately determine a current relationship between a user's hand and a mouse (e.g., by comparing detected user features with baseline features). As another example, as described in commonly-assigned, co-pending U.S. Non-Provisional Patent Application Publication No. 2017/0038905, which is hereby incorporated by reference herein in its entirety, various sensors may be used to accurately determine a location of one or more fingers on a surface of a device. 
While any suitable user characteristic or characteristics may be identified in order to determine the current user orientation of user U with respect to input assembly 200 (e.g., spacing and/or orientation between finger tips, palm print, and/or the like (e.g., average direction of each detected finger of a hand of a user with respect to any suitable dimension of assembly 200)), identification of a middle finger (e.g., right middle finger axis RMA of user U for a right hand of a user) may enable identification of user coordinate system UC and may be referred to herein for determining a current user orientation of user U to inputassembly 200. Such user orientation determination may then be used to map any detected physical manipulations ofinput assembly 200 by user U to particular commands that may appropriately control the user interface. In some embodiments,system 1 may be operative to determine axis RMA and, thus, to determine user coordinate system UC that may use axis RMA as its Y-axis. Then,system 1 may be operative to detect any user physical manipulations ofinput assembly 200 by the user with respect to determined user coordinate system UC, and then to consistently map such physical manipulations within user coordinate system UC to user interface manipulations within device coordinate system DC. - Determination of a current user orientation with respect to input assembly 200 (e.g., determination of a current orientation of user coordinate system UC of user U with respect to input coordinate system NC of assembly 200) may then be used to map particular user physical manipulations of
input assembly 200 to particular types ofcontrol data 99 that may be communicated todevice 100 for controlling device 100 (e.g., for controllingcursor 112 c onscreen 112 s). This may enable consistent control data to be generated in response to a particular user physical gesture imparted by user U oninput assembly 200 no matter what the orientation ofinput assembly 200 is to electronic device 100 (e.g., no matter the orientation of input coordinate system NC ofassembly 200 with respect to device coordinate system DC of device 100) and/or no matter what the orientation ofinput assembly 200 is to user U (e.g., no matter the orientation of input coordinate system NC ofassembly 200 with respect to user coordinate system UC of user U) and/or no matter what the orientation of user U is to electronic device 100 (e.g., no matter the orientation of user coordinate system UC of user U with respect to device coordinate system DC of device 100). Once user coordinate system UC may be determined byinput assembly 200 for a current user position or a current user orientation of user U with respect to input assembly 200 (e.g., current orientation of user coordinate system UC with respect to input coordinate system NC), a physical manipulation ofinput assembly 200 as initiated by user U from that determined current user position or orientation may be detected with respect to the determined user coordinate system UC of that current user position and then that detected physical manipulation with respect to user coordinate system UC may be mapped to user interface manipulations with device coordinate system DC for generating theappropriate control data 99, no matter what the orientation is of system UC with respect to system NC. 
Such a process may yield consistent control data for the same physical manipulation ofassembly 200 by a user with respect to a determined user coordinate system UC, no matter what the relationship between that user coordinate system UC and input coordinate system NC, and/or no matter what the relationship between that user coordinate system UC and device coordinate system DC, and/or no matter what the relationship between input coordinate system NC and device coordinate system DC. -
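Because a physical manipulation is detected relative to user coordinate system UC but the user interface lives in device coordinate system DC, the consistent mapping described above amounts to a 2-D rotation of the detected motion vector. A minimal sketch, assuming the UC-to-DC offset has already been determined as a CCW angle (the names are illustrative assumptions):

```python
import math

def map_user_motion_to_device(motion_uc, uc_to_dc_deg):
    """Rotate a motion vector detected in the user coordinate system UC
    into the device coordinate system DC, given the CCW angle (degrees)
    of UC relative to DC. Standard 2-D rotation about the vertical axis."""
    theta = math.radians(uc_to_dc_deg)
    x, y = motion_uc
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))
```

With a 0° offset the motion passes through unchanged; with UC rotated 90° CCW from DC, a push "away" from the user in UC (+Y of UC) becomes motion along -X of DC, so the same user gesture yields consistent control data regardless of how the assembly or hand is oriented.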
FIG. 3 is a flowchart of anillustrative process 300 for using an input assembly for controlling an electronic device, such as for usinginput assembly 200 for controllingelectronic device 100. Atoperation 302 ofprocess 300, the input assembly may determine whether a user interface event has been detected. If a user interface event is not detected atoperation 302, thenoperation 302 may be repeated until a user interface event is detected or until any suitable interrupt ofprocess 300 may be received. However, if a user interface event is detected atoperation 302, then process 300 may advance tooperation 304, where the input assembly may define a current user coordinate system based on the user interface event detected atoperation 302. For example, any suitable type of interaction by a user with an input assembly may be detected as a user interface event atoperation 302, including, but not limited to, detection of at least a portion of a user's hand in physical contact with any suitable surface (e.g.,surface 201 s) of the input assembly (e.g., instantaneous detection or detection of substantially consistent contact for a threshold period of time (e.g., 300 milliseconds, 500 milliseconds, etc.), etc.), detection of at least a portion of a user's hand hovering adjacent to (e.g., within a threshold distance of) but not contacting the input assembly (e.g., instantaneous detection or detection of substantially consistent hovering for a threshold period of time (e.g., 300 milliseconds, 500 milliseconds, etc.), etc.), and/or the like. For example, if a user's relationship withsystem 1 changed from that ofFIG. 1C (e.g., at which no user contact or interaction may be detected with respect to stationary assembly 200) to that ofFIG. 
1D (e.g., at which a user's hand may be positioned over or on surface 201s of stationary assembly 200), then a user interface event may be detected (e.g., at operation 302) and system 1 may attempt to detect a current user coordinate system (e.g., system UC of FIG. 1D (e.g., at operation 304)). At operation 304, the input assembly may define a current user coordinate system, such as user coordinate system UC, based on the user interface event detected at operation 302. For example, detection of at least a user's middle finger in contact with surface 201s (e.g., consistent contact at a particular position for a particular threshold length of time) may be a user interface event detected at operation 302, and then an appropriate user coordinate system UC may be defined by the input assembly at operation 304 based on that detected position of the user's middle finger. As mentioned, a relationship between the current input coordinate system NC of input assembly 200 at operation 304 and the current orientation of the user U with respect to that system NC, as may be used to define user coordinate system UC of user U at operation 304, may be any suitable relationship (e.g., aligned, offset about axis IA by 45°, offset about axis IA by 90°, etc. (e.g., one of at least 3 possible user coordinate systems (e.g., one of at least 360 possible user coordinate systems (e.g., one for each degree of rotation about axis IA)))), and variations in that relationship may not vary the control data eventually generated by process 300. However, the relationship between current input coordinate system NC of input assembly 200 at operation 304 and the current user coordinate system UC at operation 304 may be utilized internally by input assembly 200 to track movement of the user with respect to input assembly 200. - After a current user coordinate system has been defined at
operation 304,process 300 may advance tooperation 306, where the input assembly may determine whether a reset interface event has been detected. If a reset interface event is detected atoperation 306, then process 300 may return tooperation 302 until a user interface event is detected or until any suitable interrupt ofprocess 300 may be received. For example, if a reset interface event is detected,process 300 may return tooperation 302 oroperation 304 to potentially define a new current user coordinate system. However, if a reset interface event is not detected atoperation 306, then process 300 may advance tooperation 308, where the input assembly may determine whether any physical use of the input assembly has been detected. For example, any suitable type of interaction by a user with an input assembly may be detected as a reset interface event atoperation 306, including, but not limited to, detection of at least a portion or the entirety of a user's hand terminating physical contact with any suitable surface (e.g.,surface 201 s) of the input assembly (e.g., instantaneous detection of no contact or detection of consistent lack of contact for a threshold period of time (e.g., 300 milliseconds, 500 milliseconds, etc.), etc.), detection of no movement of a user's hand or of at least a portion of a user's hand for at least a threshold period of time (e.g., no detected movement of a user's hand alongsurface 201 s that is contactingsurface 201 s), and/or the like. For example, if a user's relationship withsystem 1 changed from that ofFIG. 1D (e.g., at which a first current user coordinate system UC may be defined as shown) to that ofFIG. 1C (e.g., at which no user contact or interaction may be detected), then a reset event may be detected (e.g., at operation 306) andsystem 1 may attempt to detect a new user interface event (e.g., at operation 302). As another example, if a user's relationship withsystem 1 changed from that ofFIG. 
1D (e.g., at which a first current user coordinate system UC may be defined as shown) to that of FIG. 1E and then the user did not move from the position of FIG. 1E for a particular period of time, then a reset event may be detected (e.g., at operation 306) and system 1 may attempt to detect a new user interface event (e.g., at operation 302), which may be immediately detected due to the user's current interaction with assembly 200, such that a second current user coordinate system UC may then be defined to be the system UC of FIG. 1E (e.g., at operation 304). - If a reset interface event is not detected at
operation 306, then process 300 may advance tooperation 308, where the input assembly may determine whether any physical use of the input assembly has been detected. If no physical use of the input assembly is detected atoperation 308, then process 300 may return tooperation 306 to determine if any reset interface event may be detected. However, if any physical use of the input assembly is detected atoperation 308, then process 300 may advance tooperation 310, where appropriate control data may be determined based on the physical use detected atoperation 308 and based on the current user coordinate system defined atoperation 304, where such control data may be used to control the operation of any suitable electronic device (e.g., device 100), where such control data may be generated by the input assembly and/or by the electronic device. Any suitable physical use may be detected atoperation 308, such as physical movement ofinput assembly 200 with respect to (“w/r/t”) or along work surface 5 (e.g., any movement ofassembly 200 along any path alongsurface 5 or any rotation of the entirety ofassembly 200 about axis IA), movement of a portion of user U (e.g., a finger tip or multiple finger tips) alongsurface 201 s ofassembly 200, tapping or force pressing by a portion of user U (e.g., a finger tip or multiple finger tips) downward intosurface 201 s ofassembly 200, movement of a portion of input assembly 200 (e.g., rotation ofdome structure 201 d with respect tobase structure 201 b, axial shear force ofdome structure 201 d with respect tobase structure 201 b, etc.), and/or the like. - It is understood that the operations shown in
process 300 ofFIG. 3 are only illustrative and that existing operations may be modified or omitted, additional operations may be added, and the order of certain operations may be altered. -
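The detect/define/reset/use flow of operations 302-310 can be sketched as a small state machine. The event kinds and payloads below are assumptions made for illustration, not terms from the disclosure:

```python
def process_events(events):
    """Toy state machine mirroring operations 302-310 of process 300.
    Events are (kind, payload) tuples: 'touch' defines a current user
    coordinate system (operations 302/304), 'reset' clears it so a new
    one must be detected (operation 306), and 'use' yields control data
    relative to the current UC (operations 308/310)."""
    uc = None        # current user coordinate system, None until defined
    control = []     # control data determined at operation 310
    for kind, payload in events:
        if kind == 'touch':
            uc = payload                   # define current UC (op 304)
        elif kind == 'reset':
            uc = None                      # await a new UI event (op 302)
        elif kind == 'use' and uc is not None:
            control.append((uc, payload))  # map use within UC (op 310)
    return control
```

Note that a 'use' event arriving while no user coordinate system is defined produces no control data, matching the flowchart's return to operation 302 after a reset.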
Application 203 of input assembly 200 and/or application 103 of electronic device 100 may be developed to include a rule system, such as a rule system that may be at least partially represented by rule system Table 1 provided below, that may include various rules, where each rule may be associated with a particular control action and with a particular type of physical use defined with respect to a particular current user coordinate system. For example, as shown in Table 1, each one of rules R1-R12 may be associated with or be defined to include a particular type of physical use (e.g., as may be detected at operation 308) defined with respect to a current user coordinate system UC (e.g., as may be defined at operation 304) and a particular control action that may be used to define particular control data (e.g., control data to be determined at operation 310), where such control data may be used by device 100 (e.g., by processor 102) to carry out the particular control action defined by the control data. -
TABLE 1
Rule | Physical Use w/r/t User Coordinate System (UC) | Control Action
R1 | Movement of Assembly w/r/t Work Surface in +X of UC | Move Cursor along +X of DC
R2 | Movement of Assembly w/r/t Work Surface in -X of UC | Move Cursor along -X of DC
R3 | Movement of Assembly w/r/t Work Surface in +Y of UC | Move Cursor along +Z of DC
R4 | Movement of Assembly w/r/t Work Surface in -Y of UC | Move Cursor along -Z of DC
R5 | Rotation of User w/r/t Assembly Surface in CW of UC | Rotate Cursor CW
R6 | Rotation of User w/r/t Assembly Surface in CCW of UC | Rotate Cursor CCW
R7 | Movement of User w/r/t Assembly Surface in +Y of UC | Make Cursor bigger
R8 | Movement of User w/r/t Assembly Surface in -Y of UC | Make Cursor smaller
R9 | Rotation of Assembly w/r/t Work Surface in CW of UC | Make Cursor Brighter
R10 | Rotation of Assembly w/r/t Work Surface in CCW of UC | Make Cursor Darker
R11 | Movement of Assembly Portion in +X of UC | Bounce Cursor along +X of DC
R12 | Movement of Assembly Portion in -X of UC | Bounce Cursor along -X of DC
- As shown by rules R1-R4 of Table 1, for example,
system 1 may be configured to map the X-axis of user coordinate system UC to the X-axis of device coordinate system DC and the Y-axis of user coordinate system UC to the Z-axis of device coordinate system DC for detected physical use that involves movement ofinput assembly 200 alongwork surface 5. In such embodiments, a user physical manipulation ofinput assembly 200 in the U+X direction (e.g., as determined at operation 308) of a determined current user coordinate system UC (e.g., as determined at operation 304) may be mapped by rule R1 (e.g., at operation 310) to a user interface manipulation ofdevice 100 in the +X direction of device coordinate system DC (e.g., when user U physically slidesinput assembly 200 alongsurface 5 in the U+X direction of a determined current user coordinate system UC of any one ofFIGS. 1D-1F and 2D-2F (e.g., no matter the orientation ofinput assembly 200 todevice 100, no matter the orientation of user U to inputassembly 200, and/or no matter the orientation of user U to device 100), then control data may be determined that may be operative to instructdevice 100 to correspondingly movecursor 112 c alongscreen 112 s in the +X direction of device coordinate system DC). As another example, in such embodiments, a user physical manipulation ofinput assembly 200 in the U-Y direction (e.g., as determined at operation 308) of a determined current user coordinate system UC (e.g., as determined at operation 304) may be mapped by Rule R4 (e.g., at operation 310) to a user interface manipulation ofdevice 100 in the −Z direction of device coordinate system DC (e.g., when user U physically slidesinput assembly 200 alongsurface 5 in the U-Y direction of a determined current user coordinate system UC of any one ofFIGS. 
1D-1F and 2D-2F (e.g., no matter the orientation ofinput assembly 200 todevice 100, no matter the orientation of user U to inputassembly 200, and/or no matter the orientation of user U to device 100), then control data may be determined that may be operative to instructdevice 100 to correspondingly movecursor 112 c alongscreen 112 s in the −Z direction of device coordinate system DC). - As a particular user physical manipulation of
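The UC-relative rule lookup of Table 1 can be sketched as a simple dictionary keyed by the type of physical use and its direction within the current user coordinate system. The tuple keys and action strings below are illustrative assumptions, not identifiers from the disclosure:

```python
# Minimal sketch of rules R1-R4 of Table 1. The physical use is first
# resolved into a direction within the current user coordinate system UC
# (e.g., at operation 308); action strings here are assumptions.
RULES = {
    ('move_assembly', '+X'): 'move cursor along +X of DC',  # R1
    ('move_assembly', '-X'): 'move cursor along -X of DC',  # R2
    ('move_assembly', '+Y'): 'move cursor along +Z of DC',  # R3
    ('move_assembly', '-Y'): 'move cursor along -Z of DC',  # R4
}

def control_action(physical_use, direction_in_uc):
    """Return the control action for a physical use expressed in UC
    (operation 310), or None if no rule matches."""
    return RULES.get((physical_use, direction_in_uc))
```

Because the lookup key is a direction in UC rather than in NC or DC, the same user gesture selects the same rule no matter how the assembly happens to be oriented on the work surface.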
input assembly 200 by user U may be detected by system 1 in the context of a determined user coordinate system UC of a current user orientation with respect to input assembly 200, one or more sensors or applications or processors or otherwise of assembly 200 may be operative to be recalibrated or mapped based on the current determined user coordinate system UC to detect user physical manipulation within that user coordinate system UC. This may enable different types of movement of assembly 200 with respect to input coordinate system NC to result in the same control data when those different types of movement are the result of the same type of movement with respect to a current user coordinate system UC. For example, this may enable not only the movement of assembly 200 of FIG. 1D by user U in the +Y direction of coordinate system DC (e.g., in the U+Y direction of current determined user coordinate system UC for moving foot F closer to device 100 of FIG. 1D in the +Y direction of coordinate system DC) but also the movement of assembly 200 of FIG. 2F by user U in the -X/+Y direction of coordinate system DC (e.g., in the U+Y direction of current determined user coordinate system UC (e.g., in the -X direction of input coordinate system NC) for moving foot F slightly closer to device 100 of FIG. 2F in the -X/+Y direction of coordinate system DC) to control device 100 in the same manner (e.g., to move cursor 112c along screen 112s in the +Z direction of device coordinate system DC (e.g., Rule R3 of Table 1)). Any suitable sensor input component(s) 210 may be used to sense user physical manipulation of input assembly 200 along surface 5, such as any suitable optical sensor(s), a track ball, or the like. - As another example, a user physical manipulation of
input assembly 200 may be to physically move a body part of user U with respect to (e.g., alongexterior surface 201 s). For example, when the current user coordinate system has been defined (e.g., at operation 304) to be user coordinate system UC ofFIG. 1D , and then user U physically rotates its right hand alongexterior surface 201 s ofassembly 200 in the direction of arrow CW by 45° about axis IA from the orientation ofFIG. 1D to the orientation ofFIG. 1E (e.g., as may be detected at operation 308), such detected physical use may be mapped by rule R5 (e.g., at operation 310) to determine control data that may be operative to rotate an object selected bycursor 112 c by 45° CW. Similarly, when the current user coordinate system has been defined (e.g., at operation 304) to be user coordinate system UC ofFIG. 2D , and then user U physically rotates its right hand alongexterior surface 201 s ofassembly 200 in the direction of arrow CW by 45° about axis IA from the orientation ofFIG. 2D to the orientation ofFIG. 2E (e.g., as may be detected at operation 308), such detected physical use may be mapped by rule R5 (e.g., at operation 310) to determine control data that may be operative to rotate an object selected bycursor 112 c by 45° CW. Similarly, when the current user coordinate system has been defined (e.g., at operation 304) to be user coordinate system UC ofFIG. 2F , and then user U physically rotates its right hand alongexterior surface 201 s ofassembly 200 in the direction of arrow CW by 45° about axis IA from the orientation ofFIG. 2F to the orientation ofFIG. 2D (e.g., as may be detected at operation 308), such detected physical use may be mapped by rule R5 (e.g., at operation 310) to determine control data that may be operative to rotate an object selected bycursor 112 c by 45° CW. 
Therefore, each one of these three exemplary rotations of user U by 45° CW with respect tosurface 201 s ofassembly 200 may result in the same control data being determined by process 300 (e.g., control data that may be operative to rotate an object selected bycursor 112 c by 45° CW), despite each one of the three exemplary rotations having a differently defined current user coordinate system UC (e.g., UC ofFIG. 1D , UC ofFIG. 2D , and UC ofFIG. 2F , respectively), and despite each one of those three differently defined current user coordinate systems UC having a different relationship with device coordinate system DC and/or with input coordinate system NC than one or each of the other differently defined current user coordinate systems UC. For example, system UC ofFIG. 1D and system UC ofFIG. 2D have the same relationship with system DC (e.g., X-axes aligned), but system UC ofFIG. 1D has a different relationship with system NC (e.g., X- and Y-axes aligned) than system UC ofFIG. 2D has with system NC (e.g., X- and Y-axes not aligned). As another example, system UC ofFIG. 1D has a different relationship with system DC (e.g., X-axes aligned) than system UC ofFIG. 2F has with system DC (e.g., X- and Y-axes not aligned), and system UC ofFIG. 1D has a different relationship with system NC (e.g., X- and Y-axes aligned) than system UC ofFIG. 2F has with system NC (e.g., Y-axis of UC aligned with X-axis of NC), and system NC ofFIG. 1D has a different relationship with system DC (e.g., X- and Y-axes aligned) than system NC ofFIG. 2F has with system DC (e.g., X- and Y-axes not aligned). - For example, user U may physically rotate its right hand along
exterior surface 201s in the direction of arrow CW by 45° about axis IA from the orientation of FIG. 1D to the orientation of FIG. 1E, where the current user orientation of axis RMA of user U may rotate with respect to assembly 200 (e.g., axis RMA may rotate from being aligned with feet F and B at FIG. 1D to being offset between feet F and R of FIG. 1E (e.g., system UC may rotate with respect to system NC in the direction of arrow CW by 45°)), yet system 1 (e.g., one or more motion sensor input components 210 or touch sensor (e.g., multi-touch) sensor input components 210 of assembly 200) may be operative to detect the movement of user U along exterior surface 201s with respect to the initial current determined user coordinate system UC of FIG. 1D to detect the 45° CW rotation and may be operative to generate particular command data 99 for controlling an interface of device 100 in a particular manner (e.g., rotate an object selected by cursor 112c by 45° CW) (e.g., the control action of rule R5). Similarly, user U may physically rotate its right hand along exterior surface 201s in the direction of arrow CW by 45° about axis IA from the orientation of FIG. 2F to the orientation of FIG. 2D, where the current user orientation of axis RMA of user U may rotate with respect to assembly 200 (e.g., axis RMA may rotate from being aligned with feet L and R at FIG. 2F to being offset between feet L and F of FIG. 2D (e.g., system UC may rotate with respect to system NC in the direction of arrow CW by 45°)), yet system 1 (e.g., one or more motion sensor input components 210 or touch sensor (e.g., multi-touch) sensor input components 210 of assembly 200) may be operative to detect the movement of user U along exterior surface 201s with respect to the initial current determined user coordinate system UC of FIG.
2F to detect the 45° CW rotation and may be operative to generateparticular command data 99 for controlling an interface ofdevice 100 in a particular manner (e.g., rotate an object selected bycursor 112 c by 45° CW) (e.g., the control action of Rule 5), which may be the same as the particular manner based on user physical manipulation betweenFIGS. 1D and 1E , as the same rotation of user U with respect toexterior surface 201 s occurred despite the orientation of user U todevice 100 not being the same between the two physical manipulations. In such embodiments, the initial current determined user coordinate system UC (e.g., ofFIG. 1D or ofFIG. 2F ) may be used for providing context to the entire physical manipulation (e.g., rotation) of user U with respect toexterior surface 201 s (e.g., rather than updating the determined user coordinate system UC). Therefore, in some embodiments, once an initial current determined user coordinate system UC may be determined bysystem 1, that same determined user coordinate system UC may be used for providing context to any detected user physical manipulation, as long as one or more rules are followed (e.g., as long as user U does not completely break contact withassembly 200 during the manipulation or otherwise interact withassembly 200 in a manner that may causesystem 1 to attempt to reset the current determined user coordinate system UC and determine a new user coordinate system UC (e.g., provide no movement for more than a particular threshold of time)). - As another example, a user physical manipulation of
input assembly 200 may be to physically move a body part of user U with respect to assembly 200 (e.g., along exterior surface 201 s) in a different manner (e.g., just a portion of a finger rather than all fingers and palm). For example, when the current user coordinate system has been defined (e.g., at operation 304) to be user coordinate system UC of FIG. 1D, and then user U physically flicks its middle finger along exterior surface 201 s of assembly 200 in the +Y direction (e.g., in the U+Y direction of system UC of FIG. 1D) (e.g., as may be detected at operation 308), such detected physical use may be mapped by rule R7 (e.g., at operation 310) to determine control data that may be operative to make cursor 112 c bigger. Similarly, when the current user coordinate system has been defined (e.g., at operation 304) to be user coordinate system UC of FIG. 2D, and then user U physically flicks its middle finger along exterior surface 201 s of assembly 200 in the +Y direction (e.g., in the U+Y direction of system UC of FIG. 2D) (e.g., as may be detected at operation 308), such detected physical use may be mapped by rule R7 (e.g., at operation 310) to determine control data that may be operative to make cursor 112 c bigger. Similarly, when the current user coordinate system has been defined (e.g., at operation 304) to be user coordinate system UC of FIG. 2F, and then user U physically flicks its middle finger along exterior surface 201 s of assembly 200 in the +Y direction (e.g., in the U+Y direction of system UC of FIG. 2F) (e.g., as may be detected at operation 308), such detected physical use may be mapped by rule R7 (e.g., at operation 310) to determine control data that may be operative to make cursor 112 c bigger.
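The orientation-independent mapping of these flicks can be sketched in code. This is a hypothetical illustration (the rule table, function names, and the single-angle model of user coordinate system UC are assumptions, not taken from this disclosure): a flick vector sensed in input coordinate system NC is first rotated into the current user coordinate system UC, and only then resolved against a rule table, so the same user-relative flick selects the same rule at any grip orientation.

```python
import math

# Hypothetical sketch: the rule table, function names, and single-angle model
# of user coordinate system UC are illustrative assumptions only.
RULES = {"U+Y": "R7: make cursor bigger"}  # assumed rule-to-action table

def nc_to_uc(vec_nc, uc_angle_rad):
    """Rotate a 2-D vector sensed in input coordinate system NC into the
    current user coordinate system UC, where uc_angle_rad is the angle of
    UC's +X axis measured in NC."""
    x, y = vec_nc
    c, s = math.cos(-uc_angle_rad), math.sin(-uc_angle_rad)
    return (x * c - y * s, x * s + y * c)

def map_flick(vec_nc, uc_angle_rad):
    """Resolve a flick to its dominant UC direction and look up the rule."""
    ux, uy = nc_to_uc(vec_nc, uc_angle_rad)
    if abs(uy) >= abs(ux):
        direction = "U+Y" if uy > 0 else "U-Y"
    else:
        direction = "U+X" if ux > 0 else "U-X"
    return RULES.get(direction, "unmapped")
```

Because the flick vector is expressed in UC before the rule lookup, `map_flick` returns the same action for a U+Y flick whether UC is aligned with NC (`uc_angle_rad = 0`) or rotated 45° (`uc_angle_rad = math.pi / 4`), mirroring the three equivalent flicks of FIGS. 1D, 2D, and 2F.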
Therefore, each one of these three exemplary flicks of a finger of user U along surface 201 s of assembly 200 in the +Y direction of the current user coordinate system UC may result in the same control data being determined by process 300 (e.g., control data that may be operative to make cursor 112 c bigger (e.g., control data based on the action of Rule R7)), despite each one of the three exemplary finger flicks having a differently defined current user coordinate system UC (e.g., UC of FIG. 1D, UC of FIG. 2D, and UC of FIG. 2F, respectively), and despite each one of those three differently defined current user coordinate systems UC having a different relationship with device coordinate system DC and/or with input coordinate system NC than one or each of the other differently defined current user coordinate systems UC. For example, system UC of FIG. 1D and system UC of FIG. 2D have the same relationship with system DC (e.g., X-axes aligned), but system UC of FIG. 1D has a different relationship with system NC (e.g., X- and Y-axes aligned) than system UC of FIG. 2D has with system NC (e.g., X- and Y-axes not aligned). As another example, system UC of FIG. 1D has a different relationship with system DC (e.g., X-axes aligned) than system UC of FIG. 2F has with system DC (e.g., X- and Y-axes not aligned), and system UC of FIG. 1D has a different relationship with system NC (e.g., X- and Y-axes aligned) than system UC of FIG. 2F has with system NC (e.g., Y-axis of UC aligned with X-axis of NC), and system NC of FIG. 1D has a different relationship with system DC (e.g., X- and Y-axes aligned) than system NC of FIG. 2F has with system DC (e.g., X- and Y-axes not aligned). - As another example, a user physical manipulation of
input assembly 200 may be to physically rotate housing 201 of assembly 200 about axis IA on surface 5. For example, user U may physically rotate assembly 200 in the direction of arrow CW by 45° about axis IA on surface 5 from the orientation of FIG. 1D to the orientation of FIG. 2E, where the current user orientation of axis RMA of user U may remain the same with respect to assembly 200 (e.g., axis RMA may remain aligned with feet F and B between the orientation of FIG. 1D and the orientation of FIG. 2E (e.g., the orientation of system UC to system NC may remain constant during rotation of assembly 200 by user U from FIG. 1D to FIG. 2E)), yet system 1 (e.g., one or more motion sensor input components 210 of assembly 200) may be operative to detect the rotation of assembly 200 and axis RMA of current determined user coordinate system UC about axis IA, and such detected 45° CW rotation may be operative to generate particular command data 99 for controlling an interface of device 100 in a particular manner (e.g., control data based on the action of Rule R9 (e.g., make cursor 112 c brighter)). Similarly, user U may physically rotate assembly 200 in the direction of arrow CW by 45° about axis IA on surface 5 from the orientation of FIG. 1F to the orientation of FIG. 2D, where the current user orientation of axis RMA of user U may remain the same with respect to assembly 200 (e.g., axis RMA may remain aligned between feet L and F between the orientation of FIG. 1F and the orientation of FIG. 2D), yet system 1 (e.g., one or more motion sensor input components 210 of assembly 200) may be operative to detect the rotation of assembly 200 and axis RMA of current determined user coordinate system UC about axis IA, and such detected 45° CW rotation may be operative to generate particular command data 99 for controlling an interface of device 100 in a particular manner (e.g., control data based on the action of Rule R9 (e.g., make cursor 112 c brighter)), which may be the same as the particular manner based on the user physical manipulation between FIGS. 1D and 2E, as the same rotation of user U and assembly 200 occurred despite the orientation of user U to device 100 not being the same between the two physical manipulations (e.g., despite the orientation of system UC to system DC not being the same between the two physical manipulations) and despite the orientation of user U to assembly 200 not being the same between the two physical manipulations (e.g., despite the orientation of system UC to system NC not being the same between the two physical manipulations). - As another example, a user physical manipulation of
input assembly 200 may be to physically move dome structure 201 d with respect to base structure 201 b. For example, a user physical manipulation of assembly 200 may move dome structure 201 d in the U+X direction of a current determined user coordinate system UC with respect to base structure 201 b from the position of FIG. 1G to the position of FIG. 1H by a distance D. Any suitable sensor input component(s) 210 of assembly 200 (e.g., any suitable shear force sensor(s) 210 s (e.g., with haptic feedback), which may be provided between structures 201 b and 201 d) may be operative to detect such movement for controlling an interface of device 100 in the +X direction of device coordinate system DC (e.g., when user U physically moves structure 201 d with respect to structure 201 b by distance D in the U+X direction of a determined current user coordinate system UC of any one of FIGS. 1D-1F and 2D-2F (i.e., no matter the orientation of input assembly 200 to device 100), then system 1 (e.g., assembly 200 and/or device 100) may be configured to generate and communicate control data 99 to device 100 that may be operative to instruct device 100 to correspondingly bounce (or otherwise manipulate) cursor 112 c along screen 112 s in the +X direction of device coordinate system DC by a distance proportional to distance D) (e.g., based on the action of Rule 11). Movement of structure 201 d with respect to structure 201 b may be enabled in any suitable directions (e.g., 2, 4, 8, 16, 32, or more directions within an X-Y plane of a user coordinate system UC), such that assembly 200 may be manipulated like an analog joystick controller. - Any suitable user physical interactions with respect to (e.g., any suitable physical use of)
assembly 200 may be detected by system 1 (e.g., at operation 308) for controlling an interface of device 100 according to the concepts of this disclosure (e.g., for mapping (e.g., at operation 310) a user physical manipulation as detected (e.g., at operation 308) with respect to a determined user coordinate system UC (e.g., as defined at operation 304) to a particular interface manipulation with respect to device coordinate system DC). For example, any suitable multi-touch sensor input component(s) 210 may be provided along dome exterior surface 201 s to detect any suitable touch gestures by user U along exterior surface 201 s (e.g., pinch to zoom between a thumb and index finger, full hand rotation (as mentioned above), scroll wheel by a single finger flicking motion (e.g., using a physical encoder or otherwise), scroll wheel by a single finger circular path motion (e.g., a circular dome shaped surface 201 s may be more conducive to facilitating a circular finger motion than a flat surface), single or multi-finger clicks (e.g., each finger may be tapped on surface 201 s and detected as that particular finger, such that system 1 may be operative to associate different finger clicks with different user control commands for device 100), and two or three or four finger gestures (e.g., clicks or relative movement on surface 201 s)). Various touch sensor technologies may be used with a curved exterior surface 201 s, such as capacitive touch sensor technologies (e.g., carbon nanobud, metal wire, metal mesh, conductive fabric, flexible circuitry (e.g., polyethylene terephthalate (“PET”), polyethylene naphthalate (“PEN”), polyimide (“PI”), etc.)), optical touch sensor technologies (e.g., frustrated total internal reflection (“FTIR”) multi-touch technology, etc.), ultrasonic touch sensor technologies, and/or the like.
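The single-finger circular-path scroll mentioned above can be sketched as follows. This is an illustrative sketch under assumed names (this disclosure does not specify an algorithm): successive touch samples around the dome's central axis IA are converted into a signed swept angle and quantized into scroll ticks.

```python
import math

# Illustrative sketch of the circular-path scroll idea; the function name and
# tick quantization are assumptions, not this disclosure's stated algorithm.
# Touch samples lie in the X-Y plane of input coordinate system NC, expressed
# relative to the dome's central axis IA.

def scroll_ticks(points, ticks_per_rev=24):
    """Return signed scroll ticks for a finger path (counter-clockwise
    positive), by accumulating the angle swept about axis IA."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        d = math.atan2(y1, x1) - math.atan2(y0, x0)
        # unwrap across the +/-pi seam so small motions stay small
        if d > math.pi:
            d -= 2.0 * math.pi
        elif d < -math.pi:
            d += 2.0 * math.pi
        total += d
    return int(round(total / (2.0 * math.pi) * ticks_per_rev))
```

A quarter-revolution path at the assumed default of 24 ticks per revolution yields 6 ticks; tracing the same path in the opposite direction negates the sign, much like a conventional scroll wheel.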
Any suitable touch sensing may also be enabled to detect force (e.g., a magnitude of pressure or force exerted by a user at each touch event), such as in vertical, horizontal, and/or rotational axes with respect to surface 201 s. Additionally or alternatively, any suitable optical sensor and/or inertial measurement unit (“IMU”) sensor input component(s) 210 of assembly 200 may be operative to detect physical rotation of dome structure 201 d with respect to base structure 201 b (e.g., about axis IA), which may be operative to enable assembly 200 to be physically manipulated for use as a scroll wheel (e.g., physical encoder or otherwise). Any suitable haptic and/or audible and/or visual feedback may be provided by any suitable output component(s) 212 of assembly 200 to help user U confidently interact with system 1. - Therefore, such an
assembly 200 with dome housing structure 201 d may provide not only an ambidextrous design that may be similarly used by either the left or right hand of a user, but also an orientationless design (e.g., about axis IA) that may be similarly used by any hand at any user orientation with respect to any component(s) of assembly 200 (e.g., at any orientation of system UC of any hand with respect to system NC (e.g., any one of 3 or more (e.g., 360) such orientations)), while providing consistent and expected device control of device 100. By determining a current user orientation (e.g., out of three or more possible orientations (e.g., 360 orientations for 360° rotation of user U's hand about axis IA)) with respect to assembly 200 (e.g., by sensing the position of one or more body parts of a user (e.g., the position of one or more types of digits relative to one another) using heat sensing, touch sensing, comparisons to known user orientations, etc.) for defining a user coordinate system UC with respect to input coordinate system NC in which one or more user physical manipulations of assembly 200 by user U may then be detected, a particular user manipulation within user coordinate system UC may consistently control device 100 in the same manner despite that user coordinate system UC being able to have multiple orientations with respect to input coordinate system NC of assembly 200 and/or with respect to device coordinate system DC of device 100. This may provide intuitive user control while also providing an input assembly with a pleasing physical appearance that is naturally ergonomic (e.g., to a cupped palm of a user). It is also to be understood that assembly 200 may have any suitable size and/or shape, such as a flat rectangle, and may not necessarily be domed and/or orientationless with respect to one or more axes. - Regardless of how input coordinate system NC of
assembly 200 may currently be oriented with respect to device coordinate system DC of device 100 when a current or initial or baseline user coordinate system UC of user U may be determined and defined (e.g., with respect to assembly 200), any particular physical manipulation of assembly 200 with respect to that defined user coordinate system UC may result in the same user control of device 100 (e.g., the same manipulation of cursor 112 c). - Moreover, the processes described with respect to any aspects of the disclosure may each be implemented by software, but may also be implemented in hardware, firmware, or any combination of software, hardware, and firmware. They each may also be embodied as computer-readable code recorded on a computer-readable medium. The computer-readable medium may be any data storage device that can store data or instructions which can thereafter be read by a computer system. Examples of the computer-readable medium may include, but are not limited to, read-only memory, random-access memory, flash memory, CD-ROMs, DVDs, magnetic tape, and optical data storage devices (e.g.,
memory 104 and/or memory 204 of FIG. 1). The computer-readable medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion. For example, the computer-readable medium may be communicated from one electronic device to another electronic device using any suitable communications protocol (e.g., the computer-readable medium may be communicated to electronic device 100 via communications component 106 and/or to assembly 200 via communications component 206). The computer-readable medium may embody computer-readable code, instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. A modulated data signal may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. - It is to be understood that each process may be enabled by any suitable software construct, firmware construct, one or more hardware components, or a combination thereof. For example, each process may be described in the general context of computer-executable instructions, such as program modules, that may be executed by one or more computers or other devices. Generally, a program module may include one or more routines, programs, objects, components, and/or data structures that may perform one or more particular tasks or that may implement one or more particular abstract data types. It is also to be understood that the number, configuration, functionality, and interconnection of program modules of
system 1 may be of any suitable architecture. - Many alterations and modifications of the preferred embodiments will no doubt become apparent to a person of ordinary skill in the art after having read the foregoing description. It is to be understood that the particular embodiments shown and described by way of illustration are in no way intended to be considered limiting. Thus, references to the details of the described embodiments are not intended to limit their scope.
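The orientation-determination step described in the embodiments above (sensing the position of one or more digits relative to one another to define user coordinate system UC) can be illustrated with a minimal sketch. The two-centroid heuristic and every name here are assumptions for illustration only, not this disclosure's stated method: the inferred forward axis (akin to axis RMA) points from the thumb contact toward the centroid of the fingertip contacts, and its angle in input coordinate system NC can then define the orientation of UC.

```python
import math

# Minimal sketch of the orientation-determination idea; the two-centroid
# heuristic and all names are illustrative assumptions only.

def user_axis_angle(thumb, fingertips):
    """thumb: (x, y) contact; fingertips: list of (x, y) contacts, all in NC.
    Returns the angle (radians, measured in NC) of the inferred user +Y
    axis, pointing from the thumb toward the fingertip centroid."""
    cx = sum(p[0] for p in fingertips) / len(fingertips)
    cy = sum(p[1] for p in fingertips) / len(fingertips)
    return math.atan2(cy - thumb[1], cx - thumb[0])
```

With this angle in hand, gestures sensed in NC could be rotated into UC, so that the same user-relative manipulation maps to the same device control at any grip orientation about axis IA.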
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/714,348 US10496187B2 (en) | 2016-09-23 | 2017-09-25 | Domed orientationless input assembly for controlling an electronic device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662398939P | 2016-09-23 | 2016-09-23 | |
US15/714,348 US10496187B2 (en) | 2016-09-23 | 2017-09-25 | Domed orientationless input assembly for controlling an electronic device |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180088686A1 true US20180088686A1 (en) | 2018-03-29 |
US10496187B2 US10496187B2 (en) | 2019-12-03 |
Family
ID=61687969
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/714,348 Expired - Fee Related US10496187B2 (en) | 2016-09-23 | 2017-09-25 | Domed orientationless input assembly for controlling an electronic device |
Country Status (1)
Country | Link |
---|---|
US (1) | US10496187B2 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180353054A1 (en) * | 2017-06-13 | 2018-12-13 | Sony Olympus Medical Solutions Inc. | Medical imaging apparatus |
US10359849B2 (en) * | 2015-04-14 | 2019-07-23 | Jose Antonio DELMAR LISSA | Portable communication device for transmitting touch-generated messages |
US10496187B2 (en) * | 2016-09-23 | 2019-12-03 | Apple Inc. | Domed orientationless input assembly for controlling an electronic device |
US10915184B1 (en) * | 2020-01-10 | 2021-02-09 | Pixart Imaging Inc. | Object navigation device and object navigation method |
CN112650400A (en) * | 2019-10-09 | 2021-04-13 | 东友科技股份有限公司 | Mouse and shell with flexible curved surface |
US11132070B1 (en) * | 2021-05-25 | 2021-09-28 | Arkade, Inc. | Computer input devices with hybrid translation modes |
US11736358B2 (en) * | 2017-01-20 | 2023-08-22 | Transform Sr Brands Llc | Interfacing event detectors with a network interface |
WO2024064938A1 (en) * | 2022-09-22 | 2024-03-28 | Apple Inc. | Input device for three-dimensional control |
WO2024064933A1 (en) * | 2022-09-22 | 2024-03-28 | Apple Inc. | Input device with adaptive grip orientation |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6532631B1 (en) * | 2018-10-15 | 2019-06-19 | 三菱電機株式会社 | Touch panel input device, touch panel input method, and program |
Citations (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD247746S (en) * | 1976-10-22 | 1978-04-18 | Atari, Inc. | Hand held control unit |
US6597347B1 (en) * | 1991-11-26 | 2003-07-22 | Itu Research Inc. | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
US20070080945A1 (en) * | 2002-05-28 | 2007-04-12 | Apple Computer, Inc. | Mouse having a button-less panning and scrolling switch |
US7233318B1 (en) * | 2002-03-13 | 2007-06-19 | Apple Inc. | Multi-button mouse |
US20070152966A1 (en) * | 2005-12-30 | 2007-07-05 | Apple Computer, Inc. | Mouse with optical sensing surface |
US20070247439A1 (en) * | 2004-05-18 | 2007-10-25 | Daniel Simon R | Spherical Display and Control Device |
US20080150898A1 (en) * | 2002-09-09 | 2008-06-26 | Apple, Inc. | Mouse having an optically-based scrolling feature |
US20080297478A1 (en) * | 2003-09-02 | 2008-12-04 | Steve Hotelling | Ambidextrous Mouse |
US20100201626A1 (en) * | 2005-06-03 | 2010-08-12 | Krah Christoph H | Mouse with Improved Input Mechanisms Using Touch Sensors |
US20100295787A1 (en) * | 2009-05-20 | 2010-11-25 | Sheng-Kai Tang | Ergonomic adaptive mouse without orientation limitation |
US20120068927A1 (en) * | 2005-12-27 | 2012-03-22 | Timothy Poston | Computer input device enabling three degrees of freedom and related input and feedback methods |
US20120162073A1 (en) * | 2010-12-28 | 2012-06-28 | Panasonic Corporation | Apparatus for remotely controlling another apparatus and having self-orientating capability |
US20130063350A1 (en) * | 2010-02-03 | 2013-03-14 | Nintendo Co., Ltd. | Spatially-correlated multi-display human-machine interface |
US20130241894A1 (en) * | 1999-12-06 | 2013-09-19 | Elo Touch Solutions, Inc. | Processing signals to determine spatial positions |
US20130307676A1 (en) * | 2012-05-15 | 2013-11-21 | Stmicroelectronics (Research & Development) Limited | Remote control with multiple pointing devices in different planes |
US20140062892A1 (en) * | 2012-08-28 | 2014-03-06 | Motorola Mobility Llc | Systems and Methods for A Wearable Touch-Sensitive Device |
US20140292689A1 (en) * | 2013-03-27 | 2014-10-02 | Sony Corporation | Input device, input method, and recording medium |
US20140362025A1 (en) * | 2013-06-11 | 2014-12-11 | Thomson Licensing | Spherical remote control |
US20150070278A1 (en) * | 2013-09-09 | 2015-03-12 | Synaptics Incorporated | Device and method for disambiguating button presses on a capacitive sensing mouse |
US20150169080A1 (en) * | 2013-12-18 | 2015-06-18 | Samsung Electronics Co., Ltd. | Electronic device using auxiliary input device and operating method thereof |
US20160188010A1 (en) * | 2013-02-13 | 2016-06-30 | Apple Inc. | Force sensing mouse |
US20160231819A1 (en) * | 2015-02-11 | 2016-08-11 | Avaya Inc. | Wearable system input device |
US20170038905A1 (en) * | 2014-04-21 | 2017-02-09 | Apple Inc. | Apportionment of Forces for Multi-Touch Input Devices of Electronic Devices |
US20180074639A1 (en) * | 2016-09-14 | 2018-03-15 | Microsoft Technology Licensing, Llc | Touch-display accessory with relayed display plane |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11134093A (en) | 1997-10-27 | 1999-05-21 | As Interactive:Kk | Input device |
US10496187B2 (en) * | 2016-09-23 | 2019-12-03 | Apple Inc. | Domed orientationless input assembly for controlling an electronic device |
-
2017
- 2017-09-25 US US15/714,348 patent/US10496187B2/en not_active Expired - Fee Related
Patent Citations (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD247746S (en) * | 1976-10-22 | 1978-04-18 | Atari, Inc. | Hand held control unit |
US6597347B1 (en) * | 1991-11-26 | 2003-07-22 | Itu Research Inc. | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
US20130241894A1 (en) * | 1999-12-06 | 2013-09-19 | Elo Touch Solutions, Inc. | Processing signals to determine spatial positions |
US7233318B1 (en) * | 2002-03-13 | 2007-06-19 | Apple Inc. | Multi-button mouse |
US20070080945A1 (en) * | 2002-05-28 | 2007-04-12 | Apple Computer, Inc. | Mouse having a button-less panning and scrolling switch |
US8314773B2 (en) * | 2002-09-09 | 2012-11-20 | Apple Inc. | Mouse having an optically-based scrolling feature |
US20080150898A1 (en) * | 2002-09-09 | 2008-06-26 | Apple, Inc. | Mouse having an optically-based scrolling feature |
US7808479B1 (en) * | 2003-09-02 | 2010-10-05 | Apple Inc. | Ambidextrous mouse |
US20080297478A1 (en) * | 2003-09-02 | 2008-12-04 | Steve Hotelling | Ambidextrous Mouse |
US7755605B2 (en) * | 2004-05-18 | 2010-07-13 | Simon Daniel | Spherical display and control device |
US20070247439A1 (en) * | 2004-05-18 | 2007-10-25 | Daniel Simon R | Spherical Display and Control Device |
US20100201626A1 (en) * | 2005-06-03 | 2010-08-12 | Krah Christoph H | Mouse with Improved Input Mechanisms Using Touch Sensors |
US20120068927A1 (en) * | 2005-12-27 | 2012-03-22 | Timothy Poston | Computer input device enabling three degrees of freedom and related input and feedback methods |
US8077147B2 (en) * | 2005-12-30 | 2011-12-13 | Apple Inc. | Mouse with optical sensing surface |
US20120075255A1 (en) * | 2005-12-30 | 2012-03-29 | Krah Christoph H | Mouse with optical sensing surface |
US20070152966A1 (en) * | 2005-12-30 | 2007-07-05 | Apple Computer, Inc. | Mouse with optical sensing surface |
US9152236B2 (en) * | 2007-10-24 | 2015-10-06 | Panasonic Intellectual Property Management Co., Ltd. | Apparatus for remotely controlling another apparatus and having self-orientating capability |
US20140333562A1 (en) * | 2007-10-24 | 2014-11-13 | Panasonic Corporation | Apparatus for remotely controlling another apparatus and having self-orientating capability |
US20100295787A1 (en) * | 2009-05-20 | 2010-11-25 | Sheng-Kai Tang | Ergonomic adaptive mouse without orientation limitation |
US8698748B2 (en) * | 2009-05-20 | 2014-04-15 | Asustek Computer Inc. | Ergonomic adaptive mouse without orientation limitation |
US20130063350A1 (en) * | 2010-02-03 | 2013-03-14 | Nintendo Co., Ltd. | Spatially-correlated multi-display human-machine interface |
US20120162073A1 (en) * | 2010-12-28 | 2012-06-28 | Panasonic Corporation | Apparatus for remotely controlling another apparatus and having self-orientating capability |
US8823645B2 (en) * | 2010-12-28 | 2014-09-02 | Panasonic Corporation | Apparatus for remotely controlling another apparatus and having self-orientating capability |
US20130307676A1 (en) * | 2012-05-15 | 2013-11-21 | Stmicroelectronics (Research & Development) Limited | Remote control with multiple pointing devices in different planes |
US20140062892A1 (en) * | 2012-08-28 | 2014-03-06 | Motorola Mobility Llc | Systems and Methods for A Wearable Touch-Sensitive Device |
US20160188010A1 (en) * | 2013-02-13 | 2016-06-30 | Apple Inc. | Force sensing mouse |
US20140292689A1 (en) * | 2013-03-27 | 2014-10-02 | Sony Corporation | Input device, input method, and recording medium |
US20140362025A1 (en) * | 2013-06-11 | 2014-12-11 | Thomson Licensing | Spherical remote control |
US9176602B2 (en) * | 2013-06-11 | 2015-11-03 | Thomson Licensing | Spherical remote control |
US20150070278A1 (en) * | 2013-09-09 | 2015-03-12 | Synaptics Incorporated | Device and method for disambiguating button presses on a capacitive sensing mouse |
US20150169080A1 (en) * | 2013-12-18 | 2015-06-18 | Samsung Electronics Co., Ltd. | Electronic device using auxiliary input device and operating method thereof |
US10198172B2 (en) * | 2013-12-18 | 2019-02-05 | Samsung Electronics Co., Ltd. | Electronic device using auxiliary input device and operating method thereof |
US20170038905A1 (en) * | 2014-04-21 | 2017-02-09 | Apple Inc. | Apportionment of Forces for Multi-Touch Input Devices of Electronic Devices |
US20160231819A1 (en) * | 2015-02-11 | 2016-08-11 | Avaya Inc. | Wearable system input device |
US20180074639A1 (en) * | 2016-09-14 | 2018-03-15 | Microsoft Technology Licensing, Llc | Touch-display accessory with relayed display plane |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10359849B2 (en) * | 2015-04-14 | 2019-07-23 | Jose Antonio DELMAR LISSA | Portable communication device for transmitting touch-generated messages |
US10496187B2 (en) * | 2016-09-23 | 2019-12-03 | Apple Inc. | Domed orientationless input assembly for controlling an electronic device |
US11736358B2 (en) * | 2017-01-20 | 2023-08-22 | Transform Sr Brands Llc | Interfacing event detectors with a network interface |
US20180353054A1 (en) * | 2017-06-13 | 2018-12-13 | Sony Olympus Medical Solutions Inc. | Medical imaging apparatus |
US10986981B2 (en) * | 2017-06-13 | 2021-04-27 | Sony Olympus Medical Solutions Inc. | Medical imaging apparatus with rotatable manipulation ring |
US11889977B2 (en) | 2017-06-13 | 2024-02-06 | Sony Olympus Medical Solutions Inc. | Medical imaging apparatus with rotatable manipulation ring |
CN112650400A (en) * | 2019-10-09 | 2021-04-13 | 东友科技股份有限公司 | Mouse and shell with flexible curved surface |
US10915184B1 (en) * | 2020-01-10 | 2021-02-09 | Pixart Imaging Inc. | Object navigation device and object navigation method |
CN113110750A (en) * | 2020-01-10 | 2021-07-13 | 原相科技股份有限公司 | Object navigation device and object navigation method |
US11567588B2 (en) | 2021-05-25 | 2023-01-31 | Arkade, Inc. | Computer input devices with hybrid translation modes |
US11132070B1 (en) * | 2021-05-25 | 2021-09-28 | Arkade, Inc. | Computer input devices with hybrid translation modes |
WO2024064938A1 (en) * | 2022-09-22 | 2024-03-28 | Apple Inc. | Input device for three-dimensional control |
WO2024064933A1 (en) * | 2022-09-22 | 2024-03-28 | Apple Inc. | Input device with adaptive grip orientation |
Also Published As
Publication number | Publication date |
---|---|
US10496187B2 (en) | 2019-12-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10496187B2 (en) | Domed orientationless input assembly for controlling an electronic device | |
US11429232B2 (en) | Wearable electronic devices having an inward facing input device and methods of use thereof | |
US11275456B2 (en) | Finger-wearable input assembly for controlling an electronic device | |
US10514780B2 (en) | Input device | |
US8570273B1 (en) | Input device configured to control a computing device | |
EP3164785B1 (en) | Wearable device user interface control | |
US20140198130A1 (en) | Augmented reality user interface with haptic feedback | |
US20190114075A1 (en) | Electronic device and method for executing function using input interface displayed via at least portion of content | |
JP2017531246A (en) | Handedness detection from touch input | |
JP2006340370A (en) | Input device by fingertip-mounting sensor | |
KR101215915B1 (en) | Handheld electronic device with motion-controlled cursor | |
US10540023B2 (en) | User interface devices for virtual reality system | |
US20190094996A1 (en) | Systems and related methods for facilitating pen input in a virtual reality environment | |
CN113396378A (en) | System and method for a multipurpose input device for two-dimensional and three-dimensional environments | |
EP2808774A2 (en) | Electronic device for executing application in response to user input | |
KR102527901B1 (en) | Input apparatus in electronic device and control method thereof | |
Nguyen et al. | 3DTouch: A wearable 3D input device for 3D applications | |
US20150177947A1 (en) | Enhanced User Interface Systems and Methods for Electronic Devices | |
US20140049469A1 (en) | External support system for mobile devices | |
JP6690722B2 (en) | User interface device | |
US20150177783A1 (en) | Detachable device case having an auxiliary touch input device and data handling capability | |
JP2016118947A (en) | Spatial handwriting input system using angle-adjustable virtual plane | |
KR20200101214A (en) | Electronic device for identifying coordinate of external object touching touch sensor | |
US20240103656A1 (en) | Multi-mode mouse | |
WO2021061249A1 (en) | Finger-wearable input assembly for controlling an electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: APPLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZUBER, WESLEY W.;MATLICK, JACOB L.;HUPPI, BRIAN Q.;SIGNING DATES FROM 20170922 TO 20170925;REEL/FRAME:043682/0950 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20231203 |