'''Note:''' This article was once intended to someday replace ]. Another user helped it go live, so it's no longer needed. I am leaving the page here to preserve the edit history, but I have blanked its contents to prevent it from appearing in search engine results. ] 20:41, 25 June 2006 (UTC)
----
A '''programming language''' is a systematic method to describe ]s. Programming languages are most commonly used to describe computations to ] ]s. Programming languages exist because other forms of human expression, such as ]s, are not well-suited to describing computation.

Computations expressed in programming languages are generally called '''programs'''. The human-readable description of a program written in a programming language is called '']'', which is translated by a specialized computer program called a '']'' or an '']'' into ], which can be executed directly by the ]. The act of constructing programs is called ].

Because descriptions of computations expressed in a programming language are frequently difficult for other humans (and even for the original programmer, after some time has elapsed) to follow, it is considered good programming practice to add ]s in some natural language to the source code. These comments are ignored by the computer, but they are often critical to allowing another human to understand the code easily.
==Why programming languages?==
Like other specialized languages, such as ] and ] formulae, programming languages facilitate the communication of a specific kind of knowledge---namely, '''computation''', or the task of organizing and manipulating information. For example, a programming language might enable its user to express the following:
* A step-by-step procedure for ] a list of names in ] order.
* A set of precise rules for predicting the future motion of ]s in the ].
* A specific ] allowing computers to communicate over the ].
Programming languages differ from most other forms of human expression in that they force the author to write instructions with exacting precision and completeness. In a natural language, or even mathematical notation, authors have considerable freedom to be vague and incomplete. For example, consider natural language:
* Speakers can leave things out, because humans excel at "filling in the gaps" of a partial statement. If someone says: "Going to the store," the listener might use the context to fill in the missing words: "I am going to the store on the corner."
* Speakers can make grammatical errors, because humans excel at compensating for minor errors in language. If someone says, "I am going to the store am the corner," the listener can usually tell that "am" is meant to be "on".
By contrast, in a programming language, every part of every computation must be expressed explicitly and exactly. Each phrase in a program corresponds unambiguously to its literal meaning, and no more. If the author of a program states that the program should perform an incorrect step, the program's meaning will include that incorrect step. If the author omits a necessary step, the program's meaning will not include that step. Therefore, in order to write a "correct" program, the author must be correct in every detail.
In return for this exacting discipline, programming languages reward the user with a unique power: many programming languages are ''executable'' by an electronic computer. In other words, tasks expressed in most programming languages can be performed autonomously by a computer, without human intervention. Therefore, programming languages have enormous practical utility — they enable the construction of programs that automatically perform tasks. The entire ] industry is built around the construction and use of programs. A ''']''' is a system that enables a computer to execute a program written in a programming language. Programming language implementations typically come in two flavors: ]s and ]s.

However, not all programming languages have executable implementations. In particular, most '''foundational languages''' — also called '''core languages''' or '''calculi''' (singular ''']''') — are purely mathematical constructs. The most prominent foundational language is ]'s ]. ]s use such languages to study the fundamental properties of programming languages; often, language designers apply the lessons learned from formal calculi in the design of practical programming languages. For example, the lambda calculus is the basis of ]. See also ].

Since the dawn of programming languages, countless languages have been designed from scratch, altered to meet new needs, combined with other languages, and fallen into disuse. Although there have been attempts to make one "universal" computer language that serves all purposes, all of them have failed. The need for diverse computer languages arises from the diversity of contexts in which languages are used:
* Programs range from tiny scripts written by individual hobbyists to huge systems written by hundreds of programmers.
* Programmers range in expertise from novices (who need simplicity above all else) to experts (who may be comfortable with considerable complexity in the language).
* Programs may need to extract the right amount of performance on platforms ranging from tiny microcontrollers to ]s.
* Finally, programmers may simply differ in their tastes or habits.
=== Note: Programming languages vs. computer languages ===
Programming languages are a subset of ''']s'''. Generally, a "programming language" is designed to express ''computation'', whereas the term "computer language" can denote any language used by a computer or its users. For example, a ] is a computer language, but markup languages are generally designed to describe data, not computation.

== Elements of programming languages ==
A programming language's surface form---that is, how programs are represented to a reader---is its '']''. Programming languages vary widely in this surface form. Most programming languages are textual---that is, they consist of sequences of "words" and "punctuation marks", much like written natural languages. However, some exotic programming languages are partly or entirely "visual"---such ''']''' use spatial relationships among stylized pictorial representations to communicate their meaning, much like musical notation and ]s.

Regardless of their surface form, all programming languages also specify a ], or "meaning", for each ] construct in the language. In textual programming languages, the syntactic constructs are words and phrases; in visual languages, the "syntax" consists of spatial arrangements of pictures, text, or both. In either case, the essence of a programming language is the meaning that it assigns to the various constructs it provides.
=== Syntax ===
The syntax of a language describes which sequences or arrangements of symbols are legal in the language. The syntax does not describe meaning; that is the job of semantics. Most commonly used languages are textual, so here we describe only the syntax of textual languages.

Programming language syntax is usually defined using a combination of ]s (for ] structure) and ] (for ] structure). The use of these devices gives programming language syntax a precise definition founded in ] theory. For example, a simplified grammar for Pure ] is given by the following:

 expression ::= atom | list
 atom ::= number | symbol
 number ::= digit+
 symbol ::= letter .*
 list ::= '(' expression* ')'
This indicates that:
* an expression is an atom or a list;
* an atom is a number or a symbol;
* a number is one or more digits;
* a symbol is a letter followed by zero or more of any characters; and
* a list is a pair of parentheses, with any number of expressions inside it.
The following are all expressions in this grammar:

 12345
 ()
 (a b c232 (1))
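To make the grammar concrete, the following sketch (in Python, used here purely as an illustration; the function and variable names are invented for this example) recognizes exactly the expressions above, with one parsing clause per grammar production:

```python
import re

# One token per match: a parenthesis, or a maximal run of non-space,
# non-parenthesis characters (an atom).
TOKEN = re.compile(r"\s*(\(|\)|[^\s()]+)")

def tokenize(text):
    return TOKEN.findall(text)

def parse(tokens):
    # expression ::= atom | list
    tok = tokens.pop(0)
    if tok == "(":
        # list ::= '(' expression* ')'
        items = []
        while tokens[0] != ")":
            items.append(parse(tokens))
        tokens.pop(0)  # consume the closing ')'
        return items
    if tok == ")":
        raise SyntaxError("unexpected ')'")
    if tok.isdigit():
        return int(tok)  # number ::= digit+
    return tok           # symbol ::= letter .*

print(parse(tokenize("(a b c232 (1))")))  # ['a', 'b', 'c232', [1]]
```

Because each function mirrors one production of the grammar, this style of parser is called recursive descent.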
Not all syntactically correct programs are ''semantically'' correct---in other words, it is possible for a program to have the proper syntax, but mean nothing, or mean something incorrect. This is analogous to the fact that, in natural languages, a sentence may be grammatically correct but have no meaning, or a false meaning. For example:
* The following English sentence is grammatically well-formed but means nothing: "The grundlefarb timbles over the slothrop."
* This English sentence is syntactically well-formed, and has a meaning, but its meaning is incorrect: "The ] on a ] field is usually purple."
Therefore, a programming language must have a ''semantics''---a set of rules about meaning---in addition to its syntax.
=== Semantics ===
Although programmers often notice the syntax of a programming language first, the essence of a programming language is its semantics. In fact, it is possible to have a single language with multiple syntaxes---the ] of ] and ], for example, has both a pictorial and a textual syntax, exactly equivalent in meaning.
==== Data and types ====
Internally, all data in a modern digital computer are stored simply as on-off (]) states. The data typically represent information in the real world, such as names, bank accounts, and measurements, so programming languages organise the low-level binary data into these high-level concepts.

The particular system by which data are organized in a program is the ''type system'' of the programming language; the design and study of type systems is known as ]. Languages can be classified as ''statically typed'' (e.g. ] or ]) or ''dynamically typed'' (e.g. ], ], ] or ]). Statically-typed languages can be further subdivided into languages with ''manifest types'', where each variable and function declaration has its type explicitly declared, and ''type-inferred'' languages (e.g. ], ]).

With statically-typed languages, there usually are pre-defined types for individual pieces of data (such as numbers within a certain range, strings of letters, etc.), and programmatically named values (variables) can have only one fixed type and allow only certain operations: numbers cannot change into names and vice versa. Dynamically-typed languages treat all data locations interchangeably, so inappropriate operations (like adding names, or sorting numbers alphabetically) will not cause errors until run-time. Type-inferred languages superficially treat all data as untyped, but actually perform a sophisticated analysis of the way the program uses the data to determine which elementary operations are performed on it, and thereby deduce at compile-time what type each variable has. Type-inferred languages can be more flexible to use while producing more efficient programs; however, this capability is difficult to include in a programming language implementation, so it is relatively rare.

It is possible to perform type inference on programs written in a dynamically-typed language, but it is legal to write programs in these languages that make type inference infeasible.
Sometimes statically-typed languages are called "type-safe" or "strongly typed", and dynamically-typed languages are called "untyped" or "weakly typed". Confusingly, these same terms are also used for a different distinction: between languages such as ], ], ], ], or ], in which it is impossible to use a value as a value of another type (and thereby corrupt data from an unrelated part of the program, or crash the program), and languages such as ], ], ], ], and most implementations of ], in which this is possible.

Sometimes type-inferred and dynamically-typed languages are called "latently typed."
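The run-time behavior of dynamic typing described above can be sketched in any dynamically-typed language. The following fragment (Python, chosen only as an illustration; the function name is invented) shows an inappropriate operation passing the syntax check and failing only when the offending line actually executes:

```python
def add(a, b):
    # In a dynamically-typed language the same code accepts any types
    # that happen to support '+'.
    return a + b

print(add(1, 2))        # numbers: 3
print(add("ab", "cd"))  # strings: 'abcd'

try:
    add(1, "two")       # mismatched types: no error until this line runs
except TypeError as err:
    print("run-time type error:", err)
```

A statically-typed language would instead reject the last call before the program ever ran.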
Most languages also provide ways to assemble complex ]s from built-in types (using arrays, lists, stacks, or files) and to associate names with these new combined types. ] languages allow the programmer to define new data-types, "objects", along with the "functions" to operate upon them, "methods", by assembling complex structures along with ''behaviors'' specific to those newly defined data structures.

Aside from when and how the correspondence between expressions and types is determined, there is also the crucial question of which types the language defines at all, and which types it allows as the values of expressions (''expressed values'') and as named values (''denoted values'').

Low-level languages like C typically allow programs to name memory locations, regions of memory, and compile-time constants, while allowing expressions to return values that fit into machine registers; ANSI C extended this by allowing expressions to return ] values as well. ] often allow variables to name run-time computed values directly, instead of naming memory locations where values may be stored. Languages that use ] are free to allow arbitrarily complex ]s as both expressed and denoted values. Finally, in some languages, procedures are allowed only as denoted values (they cannot be returned by expressions or bound to new names); in others, they can be passed as parameters to routines, but cannot otherwise be bound to new names; in others, they are as freely usable as any expressed value, but new ones cannot be created at run-time; and in still others, they are first-class values that can be created at run-time.
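The last case---procedures as first-class values created at run-time---can be illustrated as follows (in Python, used only as an example; the names are invented):

```python
def make_adder(n):
    # Returns a brand-new procedure, created at run-time, that
    # captures 'n' from its defining environment.
    def adder(x):
        return x + n
    return adder

add5 = make_adder(5)      # a procedure bound to a new name
print(add5(10))           # 15

# Procedures can also be passed as parameters to other routines:
print(list(map(make_adder(1), [1, 2, 3])))  # [2, 3, 4]
```

Here procedures are simultaneously expressed values (returned by `make_adder`), denoted values (bound to `add5`), and parameters (passed to `map`).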
==== Control ====
Once data has been specified, the machine must be instructed how to perform operations on it. Elementary statements may be specified using keywords or indicated by a well-defined grammatical structure. Each language takes units of these well-behaved statements and combines them using some ordering system. Depending on the language, differing methods of grouping these elementary statements exist, which allows one to write programs that can handle a variety of inputs, instead of being limited to a small number of cases. Beyond the data manipulation instructions, the other typical instructions in a language are those used for ] (branches, definitions by cases, conditionals, loops, backtracking, functional composition).
==== Naming, abstraction, and parameterization ====
The core of the idea of ''reference'' is that there must be a method of indirectly designating storage space. The most common method is through named variables. Depending on the language, further indirection may include references that are pointers to other storage space, stored in such variables or groups of variables. Similar to this method of naming storage is the method of naming groups of instructions. Most programming languages use ] calls, procedure calls, or function calls as the statements that invoke these names. Using symbolic names in this way allows a program to achieve significant flexibility, as well as a high measure of reusability. Indirect references to available programs or predefined data divisions allow many application-oriented languages to integrate typical operations as if the programming language included them as higher-level instructions.
==== Specifying language semantics ====
There are six ways in which programming language semantics are described; all languages use at least one, and some languages combine more than one:
* ]
* ]
* ]
* ] descriptions
* Reference implementations
* Test suites
The first three of these are grounded in mathematics, and have the advantage of being precise, compact, and unambiguous. Programming languages whose semantics are described using one of these methods can reap many benefits. For example:
* Formal semantics enable mathematical proofs of program correctness;
* Formal semantics facilitate the design of ]s, and proofs about the soundness of those type systems;
* Formal semantics establish unambiguous and uniform standards for implementations of the language, making it more likely that programs written in those languages will be ] across implementations.
By contrast, natural language descriptions tend to be imprecise, verbose, and ambiguous. They do not lend themselves to proofs, either about individual programs or about the programming language's type system. On the other hand, it is relatively easy for inexperienced language designers to write a natural-language description of a programming language's semantics. Additionally, formulating a rigorous mathematical semantics of a large, complex, practical programming language is a daunting task even for experienced specialists, and the resulting specification can be difficult to understand except by a small priesthood of experts.

To these objections, advocates of formal semantics reply that if a language is so complicated that a formal semantics cannot be defined for it, then a natural language description is likely to fare no better. A natural language description can always be provided as a ''supplement'' to a formal semantics. Also, formal semantics advocates point out that the imprecision of natural language as a vehicle for programming language semantics has caused problems in the real world: for example, the semantics of ] ] were specified in English, and it was later discovered that the specification did not provide adequate guidance for implementors. Writing truly portable multithreaded Java programs remains challenging to this day.

Regardless of the relative merits of formal and natural-language semantics, in practice most widely-used languages are specified using natural language description. This description usually takes the form of a ''reference manual'' for the language. The manuals for widely used languages usually run to hundreds of pages; for example, the print version of the ''Java Language Specification, 2nd Ed.'' is 544 pages long.

By contrast, ''The Definition of Standard ML, Revised'', which uses operational semantics to describe ], is 114 pages long. The ''Revised<sup>5</sup> Report on the Algorithmic Language Scheme'' (R5RS) uses denotational semantics to describe ], and is 50 pages long. (These comparisons should be taken with the caveat that Scheme and ML are both arguably simpler languages than Java.)
The fifth means of specifying language semantics is with a ''reference implementation''. In this approach, a single implementation of the programming language is designated as authoritative, and its behavior is held to define the proper behavior of a program written in this language. This approach has an attractive property: it is precise, and requires no human interpretation, since all disputes as to the meaning of a program can be settled simply and unambiguously by executing the program on this implementation.

On the other hand, this approach also has several drawbacks. Chief among them is that it conflates limitations of the reference implementation with properties of the language. For example, if the reference implementation has a bug, then that bug must be considered to be an authoritative behavior. Another drawback is that programs written in this language are likely to rely on quirks in the reference implementation, hindering portability across different implementations.

Nevertheless, several languages effectively take the reference implementation approach. For example, the ] interpreter is considered to define the authoritative behavior of Perl programs. In the case of Perl, nobody has ever produced an independent implementation of the language, and the Perl executable itself is highly portable, so some of the drawbacks of using a reference implementation to define the language semantics are moot.
The final way of specifying the meaning of a language is with a ''test suite''. In this approach, the language designer writes a number of example programs in the language, and then describes how those programs ought to behave---perhaps by writing down their correct outputs. The programs, plus their outputs, are called the "test suite" of the language. Any correct language implementation must then produce exactly the correct outputs on the test suite programs.

This technique's chief advantage is that it is easy to determine whether a language implementation is correct: the user can simply execute all the programs in the test suite and compare the outputs to the desired outputs. If the outputs match, the implementation is correct; if not, it is incorrect. However, when used by itself, the test suite approach has major drawbacks as well. For example, users want to run their own programs, which are not part of the test suite; indeed, a language implementation that could ''only'' run the programs in its test suite would be largely useless. But a test suite does not, by itself, describe how the language implementation should behave on any program not in the test suite; determining that behavior requires some extrapolation on the implementor's part, and different implementors may disagree.

Therefore, in common practice, test suites are used only in combination with one of the other language specification techniques, such as a natural language description or a reference implementation.
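The test-suite approach can be reduced to a very small sketch (Python, with a deliberately trivial invented "language" for illustration): the specification is nothing but a table of programs and their required outputs, and a candidate implementation conforms exactly when it reproduces them:

```python
# A toy "language" whose entire specification is a test suite:
# each entry pairs a program (here, just an input string) with
# the output a correct implementation must produce.
SUITE = [
    ("hello", "HELLO"),
    ("Mixed Case", "MIXED CASE"),
]

def conforms(implementation):
    # Correct iff every suite entry is reproduced exactly.
    return all(implementation(program) == expected
               for program, expected in SUITE)

# Two candidate implementations, judged solely by the suite:
print(conforms(str.upper))  # True: matches every entry
print(conforms(str.lower))  # False: disagrees with the suite
```

Note that `conforms` says nothing about inputs outside `SUITE`---which is precisely the extrapolation problem described above.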
== History of programming languages ==

=== Prehistory: Foundations and Assembly ===
The first foundational programming languages predate the modern computer entirely. The most important of these is the ''']''', invented in the ] by ] of ]. Alternatively, the instructions of the ], developed in the 1930s by ] (who studied with Church at Princeton), can be viewed as a "foundational ]". However, unlike the lambda calculus, Turing's code does not serve well as a foundation for higher-level languages---its principal use is in rigorous analyses of ] ]. It is well-suited to this task because every operation in a Turing machine takes ], whereas the lambda calculus's ''reduction''---its most fundamental operation---can take ] proportional to the size of the function body.

Like many "firsts" in computer history, the first non-foundational programming language is hard to identify. The designation depends on how much power and human-readability one requires of a form of communication before granting it the status of "programming language". ]s and ]'s ] both had simple, extremely limited languages for describing the actions that these machines should perform. One can even regard the punch holes on a ] scroll as a limited ], albeit one not fit for human consumption.

However, these machine languages were not ]---that is, "programs" created in these languages would be fundamentally less powerful than a general-purpose computer, even if they were given infinite ] (see ]). The first Turing-complete programming languages were probably the ]s of the earliest ] computers. In assembly languages, each construct in the language corresponds exactly to one instruction in a physical computer. For example, a machine might have an instruction that added the contents of two machine ]s and placed the result in a third register; such an instruction might be represented in assembly language using the following text:

 ADD R1, R2, R3

It was soon discovered that programming in assembly language was extraordinarily tedious and error-prone. Individual machine instructions were too "low-level" to construct large programs---doing so was akin to constructing a building by gluing together individual grains of ]. As demand for more complex programs grew, programmers soon realized that higher-level programming languages were necessary.
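The one-instruction-to-one-action correspondence can be sketched with a toy simulator (Python; the register names and the "result in the third operand" convention are assumptions of this example, since real instruction formats vary by architecture):

```python
# A toy register machine: three registers and one opcode.
registers = {"R1": 7, "R2": 35, "R3": 0}

def execute(instruction):
    # Decode one textual assembly instruction into one machine action.
    op, operands = instruction.split(None, 1)
    a, b, dest = (r.strip() for r in operands.split(","))
    if op == "ADD":
        registers[dest] = registers[a] + registers[b]
    else:
        raise ValueError("unknown opcode: " + op)

execute("ADD R1, R2, R3")
print(registers["R3"])  # 42
```

Each textual instruction maps to exactly one update of the machine state, mirroring the one-to-one relationship between assembly language and machine instructions.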
=== The 1950s: The Founding Three, and Algol ===
In the ], the first three modern programming languages were invented:
* ], the "'''LIS'''t '''P'''rocessor", invented by ] et al.;
* ], the "'''FOR'''mula '''TRAN'''slator", invented by ] et al.; and
* ], the '''CO'''mmon '''B'''usiness '''O'''riented '''L'''anguage, created by the ], heavily influenced by ].
Descendants of all three are still in use today, although Lisp (which most people no longer write in all capitals) has influenced later languages to a much greater extent than the other two.
The next major milestone in computer languages came between ] and ], when a committee composed jointly of American and European computer scientists held a series of meetings in Europe to design "a new language for algorithms". These meetings culminated in the ''] Report'', which described Algol 60, the "'''ALGO'''rithmic '''L'''anguage". This report consolidated many ideas then circulating in the languages community, and also featured two key innovations:
* The use of ] (BNF) for describing the language's syntax. Nearly all subsequent programming languages have used a variant of BNF to describe the ] portion of their syntax.
* The introduction of ] for names in arbitrarily nested scopes.
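The second innovation, lexical scoping, can be illustrated in almost any modern language, since nearly all of them inherited the idea; in this Python sketch, each use of a name refers to the binding in the nearest enclosing textual scope:

```python
x = "global"

def outer():
    x = "outer"
    def inner():
        # 'x' here resolves to outer's binding -- the nearest
        # enclosing textual scope -- not the global one, and not
        # whatever scope happens to call inner().
        return x
    return inner()

print(outer())  # outer
print(x)        # global
```

The meaning of each name can thus be determined by reading the program text alone, without tracing its run-time call history.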
Algol 60 was never widely used in North America---partly for political reasons, stemming from ]'s introduction of ], and partly due to the decision not to include I/O within the language definition. However, it represented a major step forward, and was influential in the education of a generation of computer scientists and the design of later languages, including ], ], ], and ]. Additionally, most modern computer science textbooks use an Algol-like notation for ] descriptions of algorithms.
=== 1967-1978: Establishing Fundamental Paradigms ===
The period from the late ] to the late ] brought a major flowering of programming languages, comparable to the ] of ]. Most of the major language paradigms now in use were invented in this period:
* ] was invented by ] and ] in ''']''', which reached maturity in ]. In ], ] followed Simula with a refinement of object-oriented concepts.
* ''']''', an early ] language, was developed by ] and ] at ] between ] and ].
* ''']''', designed in ] by ], ], and ], was the first ] language.
* ''']''' built a polymorphic type system (invented by ] in ]) on top of Lisp, pioneering ] ] languages.
Each of these languages spawned an entire family of descendants, and most modern languages count at least one of them in their ancestry.

The 1960s and 1970s also saw considerable debate over the merits of "]", which essentially meant programming without the use of ]. This debate was closely related to language design: some languages did not include GOTO, which forced structured programming on the programmer. Although the debate raged hotly at the time, nearly all programmers now agree that, even in languages that provide GOTO, it is bad ] to use it except in rare circumstances. As a result, later generations of language designers have found the structured programming debate tedious and even bewildering: first, the answer seems obvious in retrospect; and second, there were far greater revolutions afoot in language design at the time.
=== The 1980s: Consolidation, Modules, Performance ===
In contrast to the remarkable innovation of the previous decade, the ] were years of relative consolidation. ] combined object-oriented and systems programming. The United States government standardized ], a systems programming language intended for use by defense contractors. In Japan and elsewhere, vast sums were spent investigating so-called "fifth generation" languages that incorporated logic programming constructs. The functional languages community moved to standardize ML and Lisp. Rather than inventing new paradigms, all of these movements elaborated upon the ideas invented in the previous decade.

However, one important new trend in language design was an increased focus on programming for large-scale systems through the use of ''modules'', or large-scale organizational units of code. ], Ada, and ML all developed notable module systems in the 1980s. Module systems were often wedded to ] constructs---generics being, in essence, parameterized modules (see also ]).

Furthermore, although major new paradigms for programming languages did not appear, many researchers expanded on the ideas of prior languages and adapted them to new contexts. For example, the languages of the ] and ] systems adapted object-oriented programming to ].

The 1980s also brought great advances in ]. The ] movement in ] hypothesized that hardware should be designed for ]s rather than for human assembly programmers. Aided by ] speed improvements that enabled increasingly aggressive compilation techniques, the RISC movement sparked greater interest in compilation technology for high-level languages.

Language technology continued along these lines well into the ]. However, the adoption of languages has always been driven by the adoption of new computer systems, and in the mid-1990s one of the most important new systems in computer history suddenly exploded in popularity.
=== The 1990s: The Internet Age ===
The rapid growth of the ] in the mid-]s was the next major historic event in programming languages. By opening up a radically new platform for computer systems, the Internet created an opportunity for new languages to be adopted. In particular, the ] rose to popularity because of its early integration with the ] ], and various ]s achieved widespread use in developing customized applications for ]s. Neither of these developments represented much fundamental novelty in language design---for example, the design of Java was a more conservative version of ideas explored many years earlier in the Smalltalk community---but the widespread adoption of languages that supported features like ] and ] ] was a major change in programming practice.
=== Current trends in programming languages ===
Programming language evolution continues apace, in both industry and research. Some notable current directions include:
* Mechanisms for adding security and reliability verification to the language: extended static checking, information flow control, static ].
* Alternative mechanisms for modularity: ]s, ], ].
* Component-oriented software development.
* Increased emphasis on distribution and mobility.
* Integration with databases, including ] and ].
== Programming language implementation ==
Computers cannot directly execute programs expressed in a programming language. The internal representation used by computers for data and operations consists purely of a long sequence of "on" and "off" signals, or ].
== ] of programming languages ==
It is difficult to design a single overarching classification scheme for programming languages. Unlike the ] ] of ] ], any given programming language does not usually have a single ancestor language. More commonly, languages arise by combining the elements of several predecessor languages with new ideas in circulation at the time. Ideas that originate in one language will diffuse throughout a family of related languages, and then leap suddenly across familial gaps to appear in an entirely different family.

The task is further complicated by the fact that languages can be classified along multiple axes. For example, Java is both an object-oriented language (because it encourages object-oriented organization) and a concurrent language (because it contains built-in constructs for running multiple ]s in parallel). ] is an object-oriented ].

This section presents two taxonomies of programming languages: a classification by ''programming paradigm'' and a classification by ''intended domain of use''.

See also: ]
=== Classification by programming paradigm === | |||
==== Procedural languages ==== | |||
], ], ] | |||
] | |||
] | |||
ADT-based languages: ]? | |||
]? ]? | |||
==== ] ==== | |||
], ], ] | |||
], ], ] | |||
], ] | |||
], ], ], ], ] | |||
See also: ] | |||
==== ] ==== | |||
], ], ], ], ] | |||
==== Logic and constraint languages ==== | |||
'''Logical languages:''' ] formulates both data and the program itself as a special form of mathematical logic known as ] clauses, and evaluates programs with a general proof mechanism called ]. Logical languages are also called '''declarative languages''' or '''purely declarative languages'''.
], ], ] family | |||
], ], ] | |||
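As a rough sketch of the logic-programming style described above, the toy engine below derives new facts from given facts by mechanically applying a rule. It uses naive forward chaining rather than Prolog's resolution procedure, and its string encoding of facts is invented purely for this illustration.

```python
# Toy logic-programming sketch: facts plus a rule, with new facts derived
# mechanically. Uses simple forward chaining, not Prolog-style resolution.
facts = {"parent(tom, bob)", "parent(bob, ann)"}

def grandparent_rule(facts):
    """Derive grandparent(X, Z) from parent(X, Y) and parent(Y, Z)."""
    parents = [f[len("parent("):-1].split(", ")
               for f in facts if f.startswith("parent(")]
    derived = set()
    for x, y1 in parents:
        for y2, z in parents:
            if y1 == y2:
                derived.add(f"grandparent({x}, {z})")
    return derived

facts |= grandparent_rule(facts)
print(sorted(f for f in facts if f.startswith("grandparent")))
# ['grandparent(tom, ann)']
```

The programmer states ''what'' is true (the facts and the rule); the engine decides ''how'' to compute the consequences, which is the declarative character noted above.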
==== Multi-paradigm languages ==== | |||
Procedural languages with an object system grafted on top: | |||
], ], ], ], ], ] | |||
Functional/OO hybrids: | |||
] | |||
Languages with an embedded constraint/logic programming subset: | |||
] | |||
==== Machine code, assembly, and compiler intermediate forms ==== | |||
''']''' is code that is directly executable on a hardware processor and consists of raw binary data. Its form depends on the processor ]; it is typically written as numbers expressed in ] or ], with each number corresponding to a fundamental operation of the hardware: particular values activate specific wires and logic that carry out the computation. Most people do not classify machine language as a programming language, although programs can be, and in the early days of computing routinely were, written in it directly.
''']:''' An assembly language is almost always tied directly to a particular machine language. It lets machine instructions be written in a form readable by humans, and it lets a program use symbolic addresses, which the assembler translates into absolute addresses. Most assemblers also support ]s and ]s.
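The symbolic-address resolution just described can be sketched as the classic two-pass process: the first pass records where each label falls, and the second substitutes those addresses into the instructions. The instruction names and encoding below are hypothetical, invented for this illustration.

```python
# Sketch of an assembler's symbolic-address resolution, as two passes over
# a toy "assembly" program. Hypothetical instruction set, for illustration.
program = [
    ("start:", None),        # label definition
    ("LOAD",  "counter"),
    ("JUMP",  "start"),      # symbolic address, resolved below
    ("counter:", None),
]

# Pass 1: record the address (instruction index) of each label.
symbols = {}
address = 0
for mnemonic, operand in program:
    if mnemonic.endswith(":"):
        symbols[mnemonic[:-1]] = address
    else:
        address += 1

# Pass 2: emit instructions with absolute addresses substituted for labels.
machine_code = []
for mnemonic, operand in program:
    if not mnemonic.endswith(":"):
        machine_code.append((mnemonic, symbols.get(operand, operand)))

print(machine_code)   # [('LOAD', 2), ('JUMP', 0)]
```

Two passes are needed because a label such as `counter` may be used before the line that defines it has been seen.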
=== Classification by intended domain of use === | |||
==== Foundational languages (core languages, calculi) ==== | |||
==== General purpose languages ==== | |||
], ], ] | |||
==== Systems languages ==== | |||
], ], ], various forms of ] | |||
==== Scripting or embedded languages ==== | |||
], ], ], ], ] (scripting for Java), ], ], ], ], ], ], ], ], ], ], ], ], ] (ECMAScript), ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ], ] | |||
==== Domain-specific languages ==== | |||
An important class of domain-specific languages is the database manipulation languages, such as ]. These provide powerful ways of searching and manipulating relations, which are described as entity-relationship tables mapping one set of things onto other sets.
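As a small illustration of such database manipulation from a general-purpose host language, the sketch below uses Python's standard sqlite3 module to run SQL queries over an in-memory relation. The table and its rows are invented for the example.

```python
# Relational database manipulation via an embedded query language (SQL),
# the classic family of database languages. Table and data are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (name TEXT, dept TEXT)")
conn.executemany("INSERT INTO employee VALUES (?, ?)",
                 [("ada", "eng"), ("grace", "eng"), ("alan", "math")])

# A relational query: select the names belonging to one department.
rows = conn.execute(
    "SELECT name FROM employee WHERE dept = ? ORDER BY name", ("eng",)
).fetchall()
print([name for (name,) in rows])   # ['ada', 'grace']
```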
] languages: ], ]
==== Educational languages ==== | |||
], ], *] | |||
<nowiki>*</nowiki> designed partially as an educational language | |||
==== Concurrent and distributed languages ==== | |||
] (by Per Brinch Hansen), ], ], ], ], ], ], ] | |||
== Commonly used languages ==
The following languages are used by several hundred thousand to several million programmers worldwide:
* various forms of ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
* ] | |||
See also the ] and ] of programming languages. | |||
== Related articles == | |||
*'''List of programming languages''' | |||
**] | |||
**] | |||
**] | |||
**] | |||
*] | |||
*] | |||