Head-up display

Article snapshot taken from Wikipedia, available under the Creative Commons Attribution-ShareAlike license.

This article is about the military and vehicle technology. For its use in gaming, see HUD (video gaming).
HUD of an F/A-18C

A head-up display or heads-up display, also known as a HUD, is any transparent display that presents data without requiring users to look away from their usual viewpoints. The origin of the name stems from a pilot being able to view information with the head positioned "up" and looking forward, instead of angled down looking at lower instruments.

Although they were initially developed for military aviation, HUDs are now used in commercial aircraft, automobiles, and other applications.

Overview

HUD mounted in a PZL TS-11 Iskra jet trainer aircraft with a glass plate combiner and a convex collimating lens just below it

A typical HUD contains three primary components: a projector unit, a combiner, and a video generation computer.

The projection unit in a typical HUD is an optical collimator setup: a convex lens or concave mirror with a cathode-ray tube, light-emitting diode, or liquid-crystal display at its focus. This setup (a design that has been around since the invention of the reflector sight in 1900) produces an image in which the light rays are parallel, i.e. perceived to be at infinity.
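The optics of this arrangement can be illustrated with the thin-lens equation, 1/f = 1/d_o + 1/d_i: as the display (the object) approaches the focal plane, the image distance grows without bound, i.e. the light is collimated. The short sketch below is only a back-of-the-envelope illustration with made-up focal length and distances, not a model of any particular HUD's optics.

```python
# Back-of-the-envelope illustration of collimation using the thin-lens equation.
# 1/f = 1/d_o + 1/d_i  ->  d_i = 1 / (1/f - 1/d_o)
# Sign convention: a negative d_i is a virtual image on the viewer's side of the lens;
# its magnitude is the apparent distance of the HUD image.

def image_distance(f_mm: float, d_o_mm: float) -> float:
    """Image distance for a thin lens; infinity when the object sits at the focus."""
    if abs(d_o_mm - f_mm) < 1e-9:
        return float("inf")
    return 1.0 / (1.0 / f_mm - 1.0 / d_o_mm)

f = 100.0  # example focal length in mm (arbitrary)
for d_o in (80.0, 95.0, 99.0, 99.9, 100.0):
    print(f"display {d_o:5.1f} mm from lens -> image at {image_distance(f, d_o):10.1f} mm")
```

As the display moves from 80 mm to exactly 100 mm (the focal length), the apparent image distance runs away to infinity, which is why the pilot's eyes do not need to refocus between the symbology and the outside scene.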

The combiner is typically an angled flat piece of glass (a beam splitter) located directly in front of the viewer that redirects the projected image from the projector in such a way that the viewer sees the field of view and the projected infinity image at the same time. Combiners may have special coatings that reflect the monochromatic light projected onto them from the projector unit while allowing all other wavelengths of light to pass through. In some optical layouts combiners may also have a curved surface to refocus the image from the projector.

The computer provides the interface between the HUD (i.e. the projection unit) and the systems/data to be displayed, and generates the imagery and symbology to be displayed by the projection unit.

Types

Other than fixed mounted HUDs, there are also helmet-mounted displays (HMDs), a form of HUD that features a display element that moves with the orientation of the user's head.

Many modern fighters (such as the F/A-18, F-22, and Eurofighter) use both a HUD and an HMD concurrently. The F-35 Lightning II was designed without a HUD, relying solely on the HMD, making it the first modern military fighter not to have a fixed HUD.

Generations

HUDs are split into four generations reflecting the technology used to generate the images.

  • First Generation—Use a CRT to generate an image on a phosphor screen, with the disadvantage that the phosphor screen coating degrades over time. The majority of HUDs in operation today are of this type.
  • Second Generation—Use a solid state light source, for example an LED, which is modulated by an LCD screen to display an image. These systems do not fade or require the high voltages of first generation systems. These systems are used on commercial aircraft.
  • Third Generation—Use optical waveguides to produce images directly in the combiner rather than use a projection system.
  • Fourth Generation—Use a scanning laser to display images and even video imagery on a clear transparent medium.

Newer micro-display imaging technologies are being introduced, including liquid crystal display (LCD), liquid crystal on silicon (LCoS), digital micro-mirrors (DMD), and organic light-emitting diode (OLED).

History

Longitudinal cross-section of a basic reflector sight (1937 German Revi C12/A).
Copilot's HUD of a C-130J

HUDs evolved from the reflector sight, a pre-World War II parallax-free optical sight technology for military fighter aircraft. The first sight to add rudimentary information to the reflector sight was the gyro gunsight, which projected a reticle modified by air speed and turn rate to aid in leading the guns to hit a moving target (deflection aircraft gun aiming). As these sights advanced, more (and more complex) information was added. HUDs soon displayed computed gunnery solutions, using aircraft information such as airspeed and angle of attack, thus greatly increasing the accuracy pilots could achieve in air-to-air battles. An early example of what would now be termed a head-up display was the Projector System of the British AI Mk VIII air interception radar fitted to some de Havilland Mosquito night fighters, where the radar display was projected onto the aircraft's windscreen along with the artificial horizon, allowing the pilots to perform interceptions without taking their eyes from the windscreen.

In 1955 the US Navy's Office of Naval Research and Development did some research with a mock HUD concept unit along with a sidestick controller in an attempt to ease the pilot's burden of flying modern jet aircraft and make the instrumentation less complicated during flight. While their research was never incorporated in any aircraft at that time, the crude HUD mockup they built had all the features of today's modern HUD units.

HUD technology was next advanced in the Buccaneer, the prototype of which first flew on 30 April 1958. The aircraft's design called for an attack sight that would provide navigation and weapon release information for the low level attack mode. There was fierce competition between supporters of the new HUD design and supporters of the old electro-mechanical gunsight, with the HUD being described as a radical, even foolhardy option. The Air Arm branch of the Ministry sponsored the development of a Strike Sight. The Royal Aircraft Establishment (RAE) designed the equipment, which was built by Cintel, and the system was first integrated in 1958. The Cintel HUD business was taken over by Elliott Flight Automation, and the Buccaneer HUD was manufactured and further developed, continuing up to a Mark III version with a total of 375 systems made; it was given a "fit and forget" title by the Royal Navy and was still in service nearly 25 years later. BAE Systems thus has a claim to the world's first head-up display in operational service.

In the United Kingdom, it was soon noted that pilots flying with the new gun-sights were becoming better at piloting their aircraft. At this point, the HUD expanded its purpose beyond weapon aiming to general piloting. In the 1960s, French test-pilot Gilbert Klopfstein created the first modern HUD and a standardized system of HUD symbols so that pilots would only have to learn one system and could more easily transition between aircraft. The modern HUD used in instrument flight rules approaches to landing was developed in 1975. Klopfstein pioneered HUD technology in military fighter jets and helicopters, aiming to centralize critical flight data within the pilot's field of vision. This approach sought to increase the pilot's scan efficiency and reduce "task saturation" and information overload.

Use of HUDs then expanded beyond military aircraft. In the 1970s, the HUD was introduced to commercial aviation, and in 1988, the Oldsmobile Cutlass Supreme became the first production car with a head-up display.

Until a few years ago, the Embraer 190 and the Boeing 737 Next Generation aircraft (737-600, -700, -800, and -900 series) were the only commercial passenger aircraft available with HUDs. However, the technology is becoming more common, with aircraft such as the Canadair RJ, Airbus A318 and several business jets featuring the displays. HUDs have become standard equipment on the Boeing 787. Furthermore, the Airbus A320, A330, A340 and A380 families are currently undergoing the certification process for a HUD. HUDs were also added to the Space Shuttle orbiter.

Design factors

Several factors interact in the design of a HUD:

  • Field of View — also "FOV", indicates the angle(s), vertical as well as horizontal, subtended at the pilot's eye, within which the combiner displays symbology in relation to the outside view. A narrow FOV means that the view (of a runway, for example) through the combiner might include little additional information beyond the perimeters of the runway environment, whereas a wide FOV would allow a 'broader' view. For aviation applications, the major benefit of a wide FOV is that an aircraft approaching the runway in a crosswind might still have the runway in view through the combiner, even though the aircraft is pointed well away from the runway threshold; with a narrow FOV the runway would be 'off the edge' of the combiner, out of the HUD's view. Because the human eyes are separated, each eye receives a different image. The HUD image is viewable by one or both eyes, depending on technical and budget limitations in the design process. Modern expectations are that both eyes view the same image, in other words a "binocular Field of View (FOV)".
  • Collimation — The projected image is collimated, which makes the light rays parallel. Because the human eye and brain infer the distance to an object from the divergence of its light rays, collimated images on the HUD combiner are perceived as existing at or near optical infinity. This means that the pilot's eyes do not need to refocus to view the outside world and the HUD display; the image appears to be "out there", overlaying the outside world.
  • Eyebox — The optical collimator produces a cylinder of parallel light, so the display can only be viewed while the viewer's eyes are somewhere within that cylinder, a three-dimensional area called the head motion box or eyebox. Modern HUD eyeboxes are usually about 5 lateral by 3 vertical by 6 longitudinal inches. This allows the viewer some freedom of head movement, but movement too far up/down or left/right will cause the display to vanish off the edge of the collimator, and movement too far back will cause it to crop off around the edge (vignette). The pilot is able to view the entire display as long as one of their eyes is inside the eyebox.
  • Luminance/contrast — Displays have adjustments in luminance and contrast to account for ambient lighting, which can vary widely (e.g., from the glare of bright clouds to moonless night approaches to minimally lit fields).
  • Boresight — Aircraft HUD components are very accurately aligned with the aircraft's three axes – a process called boresighting – so that displayed data conforms to reality, typically with an accuracy of ±7.0 milliradians. In this case the word "conform" means, "when an object is projected on the combiner and the actual object is visible, they will be aligned". This allows the display to show the pilot exactly where the artificial horizon is, as well as the aircraft's projected path with great accuracy. When Enhanced Vision is used, for example, the display of runway lights is aligned with the actual runway lights when the real lights become visible. Boresighting is done during the aircraft's building process and can also be performed in the field on many aircraft.
  • Scaling — The displayed image (flight path, pitch and yaw scaling, etc.) is scaled to present to the pilot a picture that overlays the outside world in an exact 1:1 relationship. For example, objects (such as a runway threshold) that are 3 degrees below the horizon as viewed from the cockpit must appear at the -3 degree index on the HUD display; a simple sketch of this angle-to-pixel mapping follows this list.
  • Compatibility — HUD components are designed to be compatible with other avionics, displays, etc.
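As a rough illustration of the FOV and 1:1 scaling factors described above, the sketch below maps an angular offset from the boresight to a pixel position on the display and reports when the point falls outside the field of view (off the edge of the combiner). The field-of-view and resolution numbers are invented example values, and the simple linear mapping stands in for the calibrated optical mapping a real HUD would use.

```python
# Illustrative sketch: conformal (1:1) mapping of angular offsets to display pixels.
# All constants are invented example values, not parameters of a real HUD.

HUD_FOV_H_DEG = 30.0   # assumed horizontal field of view
HUD_FOV_V_DEG = 24.0   # assumed vertical field of view
HUD_RES_X = 1024       # assumed display resolution (pixels)
HUD_RES_Y = 768

def angles_to_pixels(az_deg: float, el_deg: float):
    """Map an azimuth/elevation offset from the boresight (degrees) to pixel offsets
    from the display centre, or return None if the point is outside the FOV."""
    if abs(az_deg) > HUD_FOV_H_DEG / 2 or abs(el_deg) > HUD_FOV_V_DEG / 2:
        return None  # off the edge of the combiner
    px = az_deg * (HUD_RES_X / HUD_FOV_H_DEG)
    py = el_deg * (HUD_RES_Y / HUD_FOV_V_DEG)
    return px, py

# A runway threshold 3 degrees below the horizon and 2 degrees right of the nose:
print(angles_to_pixels(2.0, -3.0))    # inside the FOV -> roughly (68.3, -96.0)
print(angles_to_pixels(20.0, -3.0))   # crabbed well off to the side -> None
```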

Aircraft

On aircraft avionics systems, HUDs typically operate from dual independent redundant computer systems. They receive input directly from the sensors (pitot-static, gyroscopic, navigation, etc.) aboard the aircraft and perform their own computations rather than receiving previously computed data from the flight computers. On other aircraft (the Boeing 787, for example) the HUD guidance computation for Low Visibility Take-off (LVTO) and low visibility approach comes from the same flight guidance computer that drives the autopilot. Computers are integrated with the aircraft's systems and allow connectivity onto several different data buses such as the ARINC 429, ARINC 629, and MIL-STD-1553.
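For a flavour of what travels on one of those buses, the sketch below splits a single 32-bit ARINC 429 word into its standard fields (label, SDI, data, SSM, parity). It is a deliberately simplified reading of the word layout: the label's reversed transmission order and the per-label data encodings (BNR, BCD, discretes) are ignored, and the example word is invented.

```python
# Simplified sketch of splitting a 32-bit ARINC 429 word into its standard fields.
# Bit 1 is treated as the least significant bit; real decoders must also handle the
# label's reversed bit order on the wire and the per-label data encodings.

def decode_arinc429(word: int) -> dict:
    label = word & 0xFF                 # bits 1-8: label (conventionally read in octal)
    sdi = (word >> 8) & 0x3             # bits 9-10: source/destination identifier
    data = (word >> 10) & 0x7FFFF       # bits 11-29: data field
    ssm = (word >> 29) & 0x3            # bits 30-31: sign/status matrix
    parity_ok = bin(word & 0xFFFFFFFF).count("1") % 2 == 1   # bit 32 gives odd parity
    return {"label_octal": oct(label), "sdi": sdi, "data": data,
            "ssm": ssm, "parity_ok": parity_ok}

# Invented example word, for illustration only:
print(decode_arinc429(0x800002C8))
```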

Displayed data

Typical aircraft HUDs display airspeed, altitude, a horizon line, heading, turn/bank and slip/skid indicators. These instruments are the minimum required by 14 CFR Part 91.

Other symbols and data are also available in some HUDs:

  • boresight or waterline symbol—is fixed on the display and shows where the nose of the aircraft is actually pointing.
  • flight path vector (FPV) or velocity vector symbol—shows where the aircraft is actually going, the net result of all forces acting on it. For example, if the aircraft is pitched up but is losing energy, then the FPV symbol will be below the horizon even though the boresight symbol is above the horizon. During approach and landing, a pilot can fly the approach by keeping the FPV symbol at the desired descent angle and touchdown point on the runway. A simple sketch of how the FPV position can be derived is shown after this list.
  • acceleration indicator or energy cue—typically to the left of the FPV symbol, it is above it if the aircraft is accelerating, and below the FPV symbol if decelerating.
  • angle of attack indicator—shows the wing's angle relative to the airflow, often displayed as "α".
  • navigation data and symbols—for approaches and landings, the flight guidance systems can provide visual cues based on navigation aids such as an Instrument Landing System or augmented Global Positioning System such as the Wide Area Augmentation System. Typically this is a circle which fits inside the flight path vector symbol. Pilots can fly along the correct flight path by "flying to" the guidance cue.
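As a rough sketch of how an FPV position could be derived from basic flight data (a flat-earth, small-angle simplification with invented numbers, not an algorithm taken from the references):

```python
import math

# Simplified sketch: position the flight path vector (FPV) relative to the boresight.
# Real systems derive the FPV from inertial velocity; this uses only basic air data.

def fpv_offsets_deg(true_airspeed_kt, vertical_speed_fpm, pitch_deg, heading_deg, track_deg):
    """Return (vertical, lateral) offsets of the FPV from the boresight, in degrees."""
    tas_fps = true_airspeed_kt * 1.68781        # knots -> feet per second
    vs_fps = vertical_speed_fpm / 60.0          # ft/min -> ft/s
    flight_path_angle = math.degrees(math.asin(vs_fps / tas_fps))
    vertical = flight_path_angle - pitch_deg                     # negative: FPV below boresight
    lateral = (track_deg - heading_deg + 540.0) % 360.0 - 180.0  # drift angle
    return vertical, lateral

# Pitched 5 deg nose-up but descending at 500 ft/min at 140 kt, drifting 3 deg left:
print(fpv_offsets_deg(140.0, -500.0, pitch_deg=5.0, heading_deg=343.0, track_deg=340.0))
```

In this invented case the vertical offset comes out negative (about -7 degrees), so the FPV symbol sits below the boresight symbol even though the nose is above the horizon, matching the pitched-up-but-descending example in the list above.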

Since being introduced on HUDs, both the FPV and acceleration symbols are becoming standard on head-down displays (HDD). The actual form of the FPV symbol on an HDD is not standardized, but is usually a simple aircraft drawing, such as a circle with two short angled lines (180 ± 30 degrees) and "wings" on the ends of the descending line. Keeping the FPV on the horizon allows the pilot to fly level turns at various angles of bank.

Military aircraft specific applications

F/A-18 HUD while engaged in a mock dogfight

In addition to the generic information described above, military applications include weapons system and sensor data such as:

  • target designation (TD) indicator—places a cue over an air or ground target (which is typically derived from radar or inertial navigation system data).
  • Vc—closing velocity with target.
  • Range—to target, waypoint, etc.
  • Launch Acceptability Region (LAR)—displays when an air-to-air or air-to-ground weapon can be successfully launched to reach a specified target.
  • weapon seeker or sensor line of sight—shows where a seeker or sensor is pointing.
  • weapon status—includes type and number of weapons selected, available, arming, etc.

VTOL/STOL approaches and landings

During the 1980s, the military tested the use of HUDs in vertical takeoff and landing (VTOL) and short takeoff and landing (STOL) aircraft. A HUD format was developed at NASA Ames Research Center to provide pilots of V/STOL aircraft with complete flight guidance and control information for Category IIIC terminal-area flight operations. This includes a large variety of flight operations, from STOL flights on land-based runways to VTOL operations on aircraft carriers. The principal features of this display format are the integration of the flightpath and pursuit guidance information into a narrow field of view, easily assimilated by the pilot with a single glance, and the superposition of vertical and horizontal situation information. The display is a derivative of a successful design developed for conventional transport aircraft.

Civil aircraft specific applications

The cockpit of NASA's Gulfstream GV with a synthetic vision system display. The HUD combiner is in front of the pilot (with a projector mounted above it). This combiner uses a curved surface to focus the image.

The use of head-up displays allows commercial aircraft substantial flexibility in their operations. Systems have been approved which allow reduced-visibility takeoffs and landings, as well as full Category IIIc landings. Studies have shown that the use of a HUD during landings decreases the lateral deviation from centerline in all landing conditions, although the touchdown point along the centerline is not changed.

Enhanced flight vision systems

In more advanced systems, such as the FAA-labeled Enhanced Flight Vision System, a real-world visual image can be overlaid onto the combiner. Typically an infrared camera (either single or multi-band) is installed in the nose of the aircraft to display a conformed image to the pilot. "EVS" (Enhanced Vision System) is an industry-accepted term which the FAA decided not to use because "the FAA believes could be confused with the system definition and operational concept found in 91.175(l) and (m)". In one EVS installation, the camera is actually installed at the top of the vertical stabilizer rather than "as close as practical to the pilot's eye position". When used with a HUD, however, the camera must be mounted as close as possible to the pilot's eye point, as the image is expected to "overlay" the real world as the pilot looks through the combiner.

"Registration," or the accurate overlay of the EVS image with the real world image, is one feature closely examined by authorities prior to approval of a HUD based EVS. This is because of the importance of the HUD matching the real world.

While the EVS display can greatly help, the FAA has only relaxed operating regulations so that an aircraft with EVS can perform a Category I approach to Category II minimums. In all other cases the flight crew must comply with all "unaided" visual restrictions. (For example, if the runway visibility is restricted because of fog, even though EVS may provide a clear visual image it is not appropriate (or legal) to maneuver the aircraft using only the EVS below 100 ft AGL.)

Synthetic vision systems

A synthetic vision system display

HUD systems are also being designed to utilize a synthetic vision system (SVS), which uses terrain databases to create realistic and intuitive views of the outside world.

In the SVS image to the right, immediately visible indicators include the airspeed tape on the left, altitude tape on the right, and turn/bank/slip/skid displays at the top center. The boresight symbol (-v-) is in the center and directly below that is the flight path vector symbol (the circle with short wings and a vertical stabilizer). The horizon line is visible running across the display with a break at the center, and directly to the left are the numbers at ±10 degrees with a short line at ±5 degrees (The +5 degree line is easier to see) which, along with the horizon line, show the pitch of the aircraft.

The aircraft in the image is wings level (i.e. the flight path vector symbol is flat relative to the horizon line and there is zero roll on the turn/bank indicator). Airspeed is 140 knots, altitude is 9450 feet, heading is 343 degrees (the number below the turn/bank indicator). Close inspection of the image shows a small purple circle which is displaced from the Flight Path Vector slightly to the lower right. This is the guidance cue coming from the Flight Guidance System. When stabilized on the approach, this purple symbol should be centered within the FPV.

The terrain is entirely computer generated from a high resolution terrain database.

In some systems, the SVS will calculate the aircraft's current flight path, or possible flight path (based on an aircraft performance model, the aircraft's current energy, and surrounding terrain) and then turn any obstructions red to alert the flight crew. Such a system could have prevented the crash of American Airlines Flight 965 in 1995.
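A hedged sketch of the underlying idea follows: compare a simple constant-rate altitude prediction against terrain elevations ahead and flag violations of a clearance margin. The look-ahead scheme, margin, and numbers are invented for illustration and do not come from any certified system.

```python
# Illustrative sketch: flag look-ahead times where the predicted flight path
# comes too close to the terrain (the display would paint that terrain red).

def predict_altitudes(alt_ft, vertical_speed_fpm, seconds_ahead):
    """Very simple constant-rate prediction of altitude at future times (seconds)."""
    return {t: alt_ft + vertical_speed_fpm * (t / 60.0) for t in seconds_ahead}

def conflicts(path_altitudes, terrain_ahead_ft, clearance_ft=500.0):
    """Return look-ahead times at which predicted altitude violates the clearance margin."""
    return [t for t, alt in path_altitudes.items()
            if alt - terrain_ahead_ft.get(t, float("-inf")) < clearance_ft]

# Invented numbers: descending at 1000 ft/min toward rising terrain.
path = predict_altitudes(alt_ft=9450, vertical_speed_fpm=-1000,
                         seconds_ahead=range(0, 121, 30))
terrain = {0: 2000, 30: 3500, 60: 6000, 90: 8200, 120: 9000}  # terrain elevation (ft)
print(conflicts(path, terrain))   # -> [90, 120]: conflicts 90 and 120 seconds ahead
```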

On the left side of the display is an SVS-unique symbol with the appearance of a purple, diminishing sideways ladder, which continues on the right of the display. The two lines define a "tunnel in the sky". This symbol defines the desired trajectory of the aircraft in three dimensions. For example, if the pilot had selected an airport to the left, then this symbol would curve off to the left and down. If the pilot keeps the flight path vector alongside the trajectory symbol, the craft will fly the optimum path. This path would be based on information stored in the Flight Management System's database and would show the FAA-approved approach for that airport.

The tunnel in the sky can also greatly assist the pilot when more precise four-dimensional flying is required, such as the decreased vertical or horizontal clearance requirements of RNP. Under such conditions the pilot is given a graphical depiction of where the aircraft should be and where it should be going, rather than having to mentally integrate altitude, airspeed, heading, energy, longitude and latitude to correctly fly the aircraft.
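As a hedged illustration of the kind of geometry behind such guidance, the sketch below computes cross-track and vertical deviation from one straight segment of a desired path in a local flat-earth frame; real RNP/4D guidance is far more involved, and all coordinates here are invented.

```python
import math

# Illustrative sketch: deviation of the aircraft from one straight "tunnel" segment.
# Positions are (east, north, up) in feet in a local flat-earth frame; invented values.

def path_deviation(aircraft, seg_start, seg_end):
    """Return (cross_track_ft, vertical_ft) deviation from the segment start -> end."""
    ex, ey, ez = (seg_end[i] - seg_start[i] for i in range(3))
    ax, ay, az = (aircraft[i] - seg_start[i] for i in range(3))
    horiz_len = math.hypot(ex, ey)
    cross_track = (ax * ey - ay * ex) / horiz_len     # positive: right of track
    along = (ax * ex + ay * ey) / (horiz_len ** 2)    # fraction of the way along the leg
    vertical = az - along * ez                        # offset from the straight descent profile
    return cross_track, vertical

segment_start = (0.0, 0.0, 9000.0)
segment_end = (0.0, 30000.0, 6000.0)      # tunnel descends 3000 ft over 30000 ft northbound
aircraft_pos = (400.0, 15000.0, 7800.0)   # slightly right of track, above the profile
print(path_deviation(aircraft_pos, segment_start, segment_end))   # -> (400.0, 300.0)
```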

Automobiles

HUD in a BMW E60
HUD in a Pontiac Bonneville showing a speed of 47 mph

General Motors began using head-up displays in 1988, with the first color display appearing in 1998 on the Corvette C5. In 1991, Toyota released a HUD in the Toyota Crown Majesta for its domestic market only. In 2003, BMW became the first European manufacturer to offer HUDs. The displays are becoming increasingly available in production cars, and usually offer speedometer, tachometer, and navigation system displays. Night vision information is also displayed via HUD on certain General Motors, Honda, Toyota and Lexus vehicles. Other manufacturers such as Audi, Citroën, Saab, and Nissan currently offer some form of HUD system. Motorcycle helmet HUDs are also commercially available.

Add-on HUD systems also exist, projecting the display onto a glass combiner mounted on the windshield. These systems have been marketed to police agencies for use with in-vehicle computers.

Developmental / experimental uses

HUDs have been proposed or are being experimentally developed for a number of other applications. In the military, a HUD can be used to overlay tactical information such as the output of a laser rangefinder or squadmate locations to infantrymen. A prototype HUD has also been developed that displays information on the inside of a swimmer's goggles or a scuba diver's mask. A group of electrical engineering students from the University of Massachusetts Amherst is integrating technologies in order to develop an affordable personal head-up display. One such design is a HUD in skiing goggles.

HUD systems that project information directly onto the wearer's retina with a low-powered laser (virtual retinal display) are also in experimentation. This kind of head-up display has been common in science fiction movies for decades, notably in Terminator and RoboCop.

See also

References

  1. D. N. Jarrett, Cockpit engineering, page 189
  2. Spatial disorientation in aviation By Fred H. Previc, William R. Ercoline, page 452
  3. D. N. Jarrett, Cockpit engineering, page 189
  4. "Axis History Forum • View topic – RAF Fixed and Free-mounted Reflector Gunsights". Forum.axishistory.com. Retrieved 2009-12-08.
  5. "Windshield TV Screen To Aid Blind Flying." Popular Mechanics, March 1955, p. 101.
  6. Rochester Avionics Archives
  7. ^ Spitzer, Cary R., ed. "Digital Avionics Handbook". Head-Up Displays. Boca Raton, FL: CRC Press, 2001
  8. Pope, Stephen. "The Future of Head-Up Display Technology". Aviation International News. Jan. 2006. 1 August 2011
  9. ""Oldsmobiles Pace "the Race"" Oldsmobile Club of America. 2006. Accessed 12 February 2007". Oldsclub.org. 2009-02-09. Retrieved 2009-10-02.
  10. Norris, G.; Thomas, G.; Wagner, M. and Forbes Smith, C. (2005). Boeing 787 Dreamliner—Flying Redefined. Aerospace Technical Publications International. ISBN 0-9752341-2-9.{{cite book}}: CS1 maint: multiple names: authors list (link)
  11. "Airbus A318 approved for Head Up Display". Airbus.com. 2007-12-03. Archived from the original on December 7, 2007. Retrieved 2009-10-02.
  12. The avionics handbook By Cary R. Spitzer, section 4-7
  13. "14 CFR Part 91". Airweb.faa.gov. Retrieved 2009-10-02.
  14. ""Forces in a Climb" NASA Glenn Research Center". Grc.nasa.gov. 2008-07-11. Retrieved 2009-10-02.
  15. Merrick, Vernon K., Glenn G. Farris, and Andrejs A. Vangas. "A Head Up Display for Applicatoin to V/STOL Aircraft Approach and Landing". NASA Ames Research Center 1990.
  16. Order: 8700.1 Appendix: 3 Bulletin Type: Flight Standards Handbook Bulletin for General Aviation (HBGA) Bulletin Number: HBGA 99-16 Bulletin Title: Category III Authorization for Parts 91 and 125 Operators with Head-Up Guidance Systems (HGS); LOA and Operations Effective Date: 8-31-99
  17. Falcon 2000 Becomes First Business Jet Certified Category IIIA by JAA and FAA; Aviation Weeks Show News Online September 7, 1998
  18. "Design Guidance for a HUD System is contained in Draft Advisory Circular AC 25.1329-1X, "Approval of Flight Guidance Systems" dated 10/12/2004". Airweb.faa.gov. Retrieved 2009-10-02.
  19. HUD With a Velocity (Flight Path) Vector Reduces Lateral Error During Landing in Restricted Visibility; International Journal of Aviation Psychology, 2007, Vol. 17 No 1, pages 91–108
  20. ^ DOT Docket FAA-2003-14449-45—Enhanced Flight Vision Systems
  21. 14 CFR Part 91.175 change 281 "Takeoff and Landing under IFR"
  22. "Slide 1" (PDF). Archived from the original (PDF) on March 9, 2008. Retrieved 2009-10-02.
  23. For additional information see Evaluation of Alternate Concepts for Synthetic Vision Flight Displays with Weather-Penetrating Sensor Image Inserts During Simulated Landing Approaches, NASA/TP-2003-212643
  24. "No More Flying Blind, NASA". Nasa.gov. 2007-11-30. Retrieved 2009-10-02.
  25. "PowerPoint Presentation" (PDF). Archived from the original (PDF) on March 9, 2008. Retrieved 2009-10-02.
  26. "Head-Up Displays (HUDs) Put the Essentials in Your Line of Sight—Progressive Auto Insurance Articles & Blogs". Progressive.com. 2008-07-29. Retrieved 2009-10-02.
  27. "Mike, Werner. "Test Driving the SportVue Motorcycle HUD". Motorcycles in the Fast Lane. 8 November 2005. Accessed 14 February 2007". News.motorbiker.org. Retrieved 2009-10-02.
  28. By  Julie Clothier for CNN. "Clothier, Julie. "Smart Goggles Easy on the Eyes". CNN.Com. 27 June 2005. CNN. Accessed 22 February 2007". Edition.cnn.com. Retrieved 2009-10-02. {{cite web}}: |author= has generic name (help); no-break space character in |author= at position 3 (help)
  29. "Ivan A. Bercovich, Radu-A Ivan, Jeffrey Little, Felipe Vilas-Boas. "Personal Head-Up Display" University of Massachusetts Amherst. 11 December 2008". Ecs.umass.edu. Retrieved 2009-10-02.
  30. Panagiotis Fiambolis. ""Virtual Retinal Display (VRD) Technology". Virtual Retinal Display Technology. Naval Postgraduate School. 13 February 2007". Cs.nps.navy.mil. Archived from the original on April 13, 2008. Retrieved 2009-10-02.
  31. Lake, Matt (2001-04-26). "Lake, Matt. "How It Works: Retinal Displays Add a Second Data Layer". New York Times 26 April 2001. accessed 13 February 2006". Nytimes.com. Retrieved 2009-10-02.

External links
