Today, each programming language is an isolated kingdom with its own rules, infrastructure, and even culture. Text-orientation can lead to a new landscape of fully integrated programming languages, each with its own purpose and character, yet combining seamlessly with the others.
The text-oriented conception of programming languages does not regard them as means to code algorithms and data structures, but rather as means to code text. Code is seen not as consisting essentially of expressions in a particular language, but of parsed text structures that happen to have been entered by particular means and can be output at will in other forms.
With current languages one implements software; with text-oriented languages one specifies it. That is the basic principle: the programmer describes the software, and the compiler constructs it.
In current programming languages there is already an incipient text-awareness: in macro expressions, in preprocessors such as PHP, in embedded languages such as SQL embedded in C, and in general in language interpreters and evaluation functions. In each case the text is implemented as a character string that must be rewritten or interpreted before it can be compiled or executed. Text-orientation does exactly the same thing, but without limiting text to a string expression; it builds upon parsed text instead. Text-orientation develops these rudiments of text-awareness fully, replacing them with a single principle that covers each of them and integrates them all.
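The contrast between text as a character string and text as a parsed structure can be sketched in Python, whose `ast` module offers a glimpse of the parsed-text idea. The variable names and the renaming transformation below are illustrative assumptions, not part of any proposal in this text.

```python
import ast

# Today, embedded "text" is usually a character string that must be
# re-parsed every time before it can be evaluated:
expr_as_string = "price * (1 + tax_rate)"
result = eval(expr_as_string, {"price": 100.0, "tax_rate": 0.2})

# A text-oriented system would instead carry the parsed structure itself.
# Here the same expression becomes a tree that can be inspected and
# transformed directly, with no string rewriting.
tree = ast.parse(expr_as_string, mode="eval")

# Rename a variable by walking the tree rather than substituting substrings.
for node in ast.walk(tree):
    if isinstance(node, ast.Name) and node.id == "tax_rate":
        node.id = "vat_rate"

print(ast.unparse(tree))  # the tree can be output at will in source form
```

The point of the sketch is that the tree survives between operations: the string must be parsed anew on every use, while the parsed text is the working object itself.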
Note on Lisp. Unfortunately I know Lisp only superficially. It is probably the most text-aware programming language in existence today. See for example what the Lisp hacker Paul Graham says about it:
Lisp looks strange not so much because it has a strange syntax as because it has no syntax; you express programs directly in the parse trees that get built behind the scenes when other languages are parsed, and these trees are made of lists, which are Lisp data structures.
Paul Graham: Revenge of the Nerds. In “Hackers & Painters”, O'Reilly, 1st ed., 2004, p. 188.
There is a close resemblance between this and my conception, with two differences: the underlying structure I propose for the parse trees is not the list but the general text structure, and I apply this structure not to a single language but to all of them. The main difference, though, is that Lisp remains an implementation language (it runs on an interpreter) and is thus not as generally applicable as text understood as a specification structure.
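Graham's point that Lisp programs are their own parse trees can be modeled in a few lines of Python. This is a minimal illustrative sketch, not any real Lisp implementation: the function `sexp_eval` and the operator table are invented names, and only numeric atoms and a handful of operators are handled.

```python
import operator

# Illustrative operator table for the sketch.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def sexp_eval(expr):
    """Evaluate an expression given directly as a tree; there is no syntax to parse."""
    if isinstance(expr, list):                 # a combination: [op, arg1, arg2, ...]
        op, *args = expr
        values = [sexp_eval(a) for a in args]
        result = values[0]
        for v in values[1:]:                   # fold the operator over the arguments
            result = OPS[op](result, v)
        return result
    return expr                                # an atom: a number stands for itself

# The program (* (+ 1 2) 4), written directly as its parse tree:
program = ["*", ["+", 1, 2], 4]
print(sexp_eval(program))  # → 12
```

The nested list is simultaneously the program's source and its parsed form, which is exactly the property the quotation describes; the text-oriented proposal generalizes that property from lists to the general text structure.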
Concept of Language
Our theory does not, as usual, take the language as the basic concept and then define a particular text as one of its productions. On the contrary, here the basic concept is the text, and a particular language can be recognized within an existing collection of texts. The text is universal; each language is a particular.
Language is a structural property of a text base. A language is a layer in a text corpus that provides some ground text units and ways to combine them. Language is urbanization: common things become handy, expressed with minimal means. Language is infrastructure: it is a text factory that multiplies productivity for commonly used products. Language as a tool can be improved with experience and transmitted between generations as heritage.
A language is not a closed set; there is always a margin at its bounds. If you analyze an existing text corpus, you can consider some text units to be part of the language or to be particular productions. If you create a language for producing new texts, there are many different limits you can set.
A language can have a notation of its own. This is a sign of maturity, but languages exist without one, and some have more than one. A notation is a user interface for a language, for both human and automated agents.
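The idea that a notation is merely an interface to one underlying structure can be sketched as follows. This is a toy illustration under invented names (`to_infix`, `to_prefix`, and the tuple encoding are all assumptions): one parsed text structure, rendered in two different notations.

```python
def to_infix(node):
    """Render a (op, left, right) tree in conventional infix notation."""
    if isinstance(node, tuple):
        op, left, right = node
        return f"({to_infix(left)} {op} {to_infix(right)})"
    return str(node)

def to_prefix(node):
    """Render the same tree in Lisp-style prefix notation."""
    if isinstance(node, tuple):
        op, left, right = node
        return f"({op} {to_prefix(left)} {to_prefix(right)})"
    return str(node)

tree = ("*", ("+", 1, 2), 4)  # one text structure...
print(to_infix(tree))         # ...rendered for one audience: ((1 + 2) * 4)
print(to_prefix(tree))        # ...and for another: (* (+ 1 2) 4)
```

Neither rendering is the text itself; each is a view of it, which is the sense in which a notation serves as a user interface.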
The text theory states that languages are not intrinsically isolated from one another and that their expressions can all be reduced to the general text structure. Languages are thus equivalent to each other insofar as one can build the same text structures with them. What, then, is the essence of a particular language? When talking about a particular language, one refers to particular semantics, to a particular coding system, or to both.
The semantics of a language form a fundamental text layer upon which all expressions are based. This layer does not get lost after the code is parsed; it continues to have an effect afterwards. In fact the majority of expressions are small join sentences that merely combine large preexisting bodies of assertions. In this respect languages are extremely important and have a direct impact on working systems.
On the other hand, languages provide a coding system: an arrangement of lexical baggage, syntactical rules, and notation that one can use to produce new texts and to express existing ones. Although this vanishes after parsing and does not affect the running system, it is extremely important as the human interface to text. This has a technological aspect: a compact, precise, clear language is a tool that results in effective, sound work. It has a mental aspect, too. The language interface creates the user experience, the world one lives in when writing and reading, and so determines how we understand and imagine things. The coding of a language is therefore, for us, close to its semantic structures: thinking about ways of coding can lead to semantic improvements, and coding facilities provide access to semantic functionality without requiring theoretical instruction.
To sum up, language diversity is desirable. Although a general-purpose text language is possible and useful, it should serve theoretical and implementation purposes and act as an optional, universal, well-known way of expression; it cannot replace the multiplicity of languages, each of which is unique not only in its semantics but also in its technological and human aspects.