Monday, February 9, 2009

Language and Evolution

Basil Gala, PhD.
(5557 Words)

The Christian Bible says "in the beginning was the word, and the word was God." "Word" here translates the Greek "logos"; Greek was the language of the New Testament and of educated people in the time of Christ, and "logos" refers to Jesus. This is a prime example of the importance of language in human life. Recently, researchers in human evolution have found that the development of speech and of the speech center in the left hemisphere of the brain played a critical role in the emergence of humans as a species distinct from the common primate ancestor of Homo sapiens and chimpanzees. A large vocabulary, much larger than the thirty or so sounds chimps use to communicate, was the beginning of human divergence from the common primate stock. Over a period of several million years, pre-humans and true humans expanded their vocabulary to an average of sixty thousand words today. A powerful language was more important to human evolution than walking on two legs (bipedal locomotion) or using hands on tools and weapons. As the organ of nerve cells (neurons) at the back of the brain (the cerebellum) grew substantially in size and complexity to handle balance and stability on two legs, and as the motor center grew to allow more adept manipulation of objects, so the speech center became enlarged and more capable of dealing with words. More power with words meant better thinking to plan, to remember, and to communicate, granting a major survival advantage to those possessing it; the genes augmenting this power with words spread quickly in the population. Humans evolved with the use of language at an accelerating pace. Individual minds evolve similarly in childhood and adulthood, in social interactions and formal schooling. Can we make this evolution of mind a partly conscious and controlled procedure? We are beginning to understand much better how language works to make us more effective, and in the near future we will be able to control the process of creating new concepts for our natural language, thus expanding our minds in unimaginable ways.

Language evolves; we see that in children, who replay the entire evolution of life from the womb outwards. At about two years of age, children begin talking with simple and halting words and phrases. After a few months of baby talk, connections are established in the brain (as they are for walking), and talking usually comes out with a rush, a replay of human verbal evolution. As the child grows into an adult, language becomes more complex and effective, reflecting the evolution of the brain. In old age, language, writing, and brain activity deteriorate back into childhood patterns. Sometimes language, thought, and behavior degrade before old age; we see evidence of that on walls covered with graffiti, airwaves filled with rap music, and theaters crowded for the violence of films with cartoon superheroes. Overall, however, as long as civilization continues on our planet, language and thought are advancing in effectiveness, nobility, and power. Evolution to more powerful modes of language does not need to come to an end.

Language is made up of words, which are strings of symbols, such as letters together with other special characters. Symbols make it possible for us to communicate better with each other and with ourselves, but gestures work well too, as you can see when watching deaf people signing or Italians in animated conversation. Gestures are symbols too, as are postures. Are symbols necessary so that we may think? Not at a basic level, because we can think with pictures, with configurations of sounds, even with scents, tastes, and tactile sensations. We can guess how prehumans thought before symbolic language emerged. In many cultures the first written words were pictorial, such as the early hieroglyphics of ancient Egypt, which were later stylized and eventually became brief symbols, similar to what we use today to form our words and sentences. The paintings on the walls of the Altamira caves in Spain, exquisite as they are, were prehistoric expressions of language. Advanced thinking, more complex and effective, requires abstract concepts and their corresponding words, such as justice, science, evolution, computation, history, integration.

What exactly is a concept? A concept is a classification of sense data or other input to the mind into organized groupings, so that you can respond appropriately, for your survival, to a challenge from your environment. For example, you may see certain moves by someone which you recognize as threatening, and respond by taking defensive measures. You have conceived that you are being threatened; classifying the sense data you have received, the concept you have derived is that of threat. Alternatively, other moves you see you may classify as humor or love. Concepts thus defined are similar to Immanuel Kant's categories, a major feature of his philosophy of logic.

I am concerned here with advanced concepts of abstraction, a high-level natural language. In computer science, researchers developed a number of high-level languages to program computers with greater ease, such as FORTRAN, Pascal, and C++. My aim here is not to get strictly technical and discuss languages for calculations, symbol manipulation, and picture processing, but what we have learned from computer linguists can be of some use in understanding natural languages. A natural language is much more extensive and versatile, capable of dealing with a great variety of subjects, including feelings, than a computer language limited to a few hundred words and symbols that reduce to a small set of digital operations. English has hundreds of thousands of words, capable of expressing every thought, sensation, and emotion we can experience; a natural language is constantly changing and expanding in response to new experiences and challenges facing a human population.

Today we face challenges as fearsome as those we met when we emerged as a small band of Homo sapiens in arid East Africa a hundred thousand years or so ago, challenges which again threaten our existence as a species. I expect that if anything can save us now, it is a new and transcendental way of talking, writing, and thinking. Transcending old ways of dealing with the world and with each other means rejecting confrontation, exploitation, and destruction in favor of greater cooperation among ourselves and with the web of life on our planet. The age-old ways of speaking and thinking worked well when populations around the earth were small: tribes hemmed in by powerful predators and a harsh nature, amid what appeared to be unlimited resources of vast forests, deep oceans, and unscalable mountains.

From an orbiting satellite, the Pacific Ocean looks like a pond, and Mount Everest a mere molehill. The forests are burning everywhere to make room for fields and buildings, or being cut down for fuel and lumber. The challenges are: destruction of human habitats, depletion of natural resources, planetary pollution, overpopulation, intractable epidemics, and warfare with ever more powerful weapons. We have to stop doing what we habitually do, and think with new concepts and words, suited to our present predicament.

Words or other symbols refer to things or concepts, which are called referents. Words represent referents; they take the place of referents in communication and in thinking. Using words we can do work in the mind without handling the actual things they represent, like traveling mentally over a map instead of the actual terrain, noting hazards, time constraints, and destinations. Words allow us to organize, plan, command, entertain, motivate ourselves and others, lead, woo, and impress, but mainly they allow us to think on an abstract level. Words can also mislead, so conditioned are we to them after many years of using them daily. Say the word lemon, visualize the fruit, and your mouth fills with saliva. Recite your national anthem, and your heart fills with pride, ready to drive you to battle.

What comes first, the word or the concept? Clearly, the concept. First we form a concept, such as pride, pointing to the emotion; then we configure the word or symbol to represent it. Conceiving new concepts is the creative part of language. There are now words for every concept so far conceived by humans. Where do I find new concepts of value? How do I spur the mind to bring forth a new concept of importance?

Such was the concept of science, the scientific method. René Descartes and Francis Bacon formalized the scientific method in the seventeenth century A.D. More than two thousand years earlier, in the sixth and fifth centuries B.C., Thales and Democritus did science without formalizing a method for it. Later, Plato and the Socratic philosophers gained influence with idealism, which overwhelmed early science in Greece. Idealism spread with Christianity and the Romans to all of Europe. The idealists deal with ideals and ideas: words often without referents, without end, without observations, measurements, or experiments.

We know now that we arrive at new and useful concepts by questioning Nature with experiments and carefully observing her answers. We need not exclude our own minds as objects of observation and study. Our minds are part of Nature also; we can study them with introspection and meditation, as the ancient Indian gurus learned to do long ago, making vital discoveries about our spirits. Combining Eastern wisdom from meditation with the practicality of Western science, connecting the immense outer universe with the equally vast inner one, human evolution may reach levels of thought we cannot even imagine today, with our most potent concepts yet to be conceived.

Again, how do we conceive concepts? When I think about a problem on any subject, words and symbols spring to mind which I have not coined. Am I thinking my own ideas or rehashing those I have read? But if I have a new or unsolved problem, and if I come up with a unique solution, then I can coin words to fit the concepts I have discovered in the struggle for the solution. Posing a question is the way to discovery, learning, and evolution. Sometimes the question is foisted upon us by nature; it is a natural challenge for survival, as imposed on simpler animals and plants, a challenge to which we must respond by changing ourselves to fit the solution. Bees, ants, and other social insects hit on solutions to their survival problems long ago; they have stuck to their social devices for millions of years, the queen mother, workers, and warriors, without much change in behavior. Birds build their nests in the same stereotyped way for each species; the cuckoo has its own peculiar way of caring for its young, and so does the wasp planting its eggs in a tarantula. Such animals don't deviate much from their habitual behavior.

Advanced people explore and deviate from what they are accustomed to; they keep posing questions to themselves out of curiosity; they do not rest until they have answers or they have died.

Such was Carl Friedrich Gauss, interested in the statistics of disease and death while working for a life insurance fund of the German government. He posed the question: can we predict how many people will live to what age in a large population of insured individuals? It was important to fix these statistics to determine premiums for policies. Studying this problem, Gauss came up with the concept of a frequency distribution, today called Gaussian and commonly known as the normal or bell-shaped distribution. Pose an important problem and find the answer; if the answer is new and original, you have a concept you can name, or your name becomes a suitable word for it.
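A minimal sketch of that bell curve in Python (my own illustration; the mean of 70 years and spread of 10 are invented numbers, not Gauss's data):

    import math

    def gaussian_pdf(x, mean, sigma):
        """Bell-shaped (normal) density: it peaks at the mean; sigma sets the spread."""
        return math.exp(-((x - mean) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    # Hypothetical mortality curve: mean age at death 70 years, spread 10 years.
    for age in (50, 60, 70, 80, 90):
        print(age, round(gaussian_pdf(age, mean=70, sigma=10), 4))

The relative heights along the curve are what an insurer needs: they indicate how many policyholders, out of a large pool, can be expected to die near each age.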

How do we produce suitable words? Who were the people who invented the words in Webster's dictionary, and how did the use of the new words spread in the population, becoming standard English? Scholars imported many of the words from Latin, Greek, and other languages, used them in their books and speeches, and gradually the circle of their use expanded among educated people and later to the population at large. But this explanation only pushes the problem back to other languages: how did the Greeks go about inventing their words? Spoken language predates its written form by many thousands of years. Many words are imitations of natural sounds: murmur, screech, whistle, crash, bang, and so on. These phonetic words, however, don't take us very far into the dictionary. I can imagine that some bright fellow, a leader or primitive intellectual of an ancient tribe, woke up one day and got the notion of naming objects not previously named by the tribe. That person picked up a rock and called it such; a piece of wood, the same; then a handful of water. Later the leader talked to fellow tribal members and instructed them in the new words. Everyone imitated the leader, as monkeys do today, the new words accepted and their use spreading quickly because they were useful for communication and survival.

Today scientists and engineers perform similar feats with language when they observe new phenomena in their research. Consider the language of organic chemists, who describe combinations of atoms by compounding the names of the elements in a molecule, or that of physicists, who explain the structure of subatomic particles with bosons, mesons, leptons, gravitons, and quarks.

Naming things, such as objects, sensations, even feelings, is not yet creating abstract concepts; names of this kind are concrete words, which are fine in literature, especially in metaphors. We get abstraction when we name classes of things. In this sense the word chair is abstract, unless we are pointing to a particular chair, because it names the class of furniture pieces with legs and a top on which we sit. Furniture is the genus to which the species chair belongs; furniture includes many other artifacts besides chairs and is therefore more abstract than the word chair. Furniture in turn is a species in the genus of human artifacts. We can continue this process of naming broader classes of things until we get to the word universe, or God, containing everything that exists.
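This genus-and-species ladder maps naturally onto class hierarchies in programming; a small Python sketch (the class names are mine, purely for illustration):

    class Artifact:                # the broad genus: anything humans make
        pass

    class Furniture(Artifact):     # a species of artifact, and a genus in its turn
        pass

    class Chair(Furniture):        # a species of furniture: legs and a top we sit on
        def __init__(self, legs=4):
            self.legs = legs

    c = Chair()
    print(isinstance(c, Furniture), isinstance(c, Artifact))  # True True

Each class up the chain names a broader collection of things, just as each word up the chain names a more abstract concept.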

How about imaginary things, which presumably do not exist, such as unicorns and leprechauns? Anything we can imagine does in a sense exist; it becomes part of the universe when we have thought of it, since we are parts of the universe. Certainly this is so when we externalize our imaginings, putting them in some more durable form such as a piece of writing, a painting, a sculpture, or any object we make from materials in nature. In our modern world we live constantly among products of the human imagination, from televisions to cellular phones to computers, millions of artifacts constantly emerging. We have thus become creators as well as namers of abstract things.

Abstracting is a process of composing a larger entity from smaller elements. This is the case in music, where we have an infinite variety of themes developed from the seven musical notes: do, re, mi, fa, sol, la, si. The song you enjoy so much is simply the composer's abstract thought set to music. Similarly, a beautiful mosaic picture is made up of thousands of small colored pebbles. Seurat painted his sublime canvases using points of paint, an art called pointillism. Van Gogh created his masterpieces with a few simple strokes of his brush. Hemingway did the same with the small, simple words of the English language, crafting big effects in "The Sun Also Rises," "A Farewell to Arms," and "The Old Man and the Sea."

Words from a natural language like English are like field stones: they have different shapes, colors, materials, sizes; computer words are bricks or cement blocks, manufactured, not grown from living experiences. The author builds a structure with care using natural words, a thing of beauty and meaning, a new abstraction.

The point is: abstraction and creation emerge from putting together simpler elements, called primitives. What enters the composition in this synthesis of elements is the ineffable breath of design which gives it life, a breath from the spirit of the creator, human or divine. Analysis breaks down the design into its basic parts, isolating each part from the others, robbing it of vitality. Analysis can be very instructive to the designer, like reverse engineering by imitators, but for the rest of us criticism is decay, devoid of joy, with a bad smell. We are left with elements which amount to almost nothing, ghostlike, like the remains of matter left from supercollider experiments at CERN or Fermilab.

As to the most fundamental elements, the primitives in any composition, what are they, and what faith can we place in them? In geometry we accept the existence of what we call a point and a line without providing any definitions for these. With the point and the line taken as given, together with a few other such undefined elements and unproven truths called axioms, Euclid built up his entire structure of geometrical theorems. When we analyze anything, eventually we get to precepts and truths we accept on faith. These have been called inborn, or God-given, truths. Induction, for example, is a mental process in logical thought common to all humans not mentally impaired.

Take a look at the sequence of triples: (2,3,4) (3,4,5) (4,5,6). What comes next in the sequence? Any human will use the inborn trait of induction to reply (5,6,7). The most intelligent chimpanzee, although capable of incredible feats of short-term memory, will fail this test of induction. A human child will pass it readily.
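A toy sketch of that induction in Python, inferring the constant step between consecutive triples and extrapolating once (a model of this one test, not of induction in general):

    def next_triple(triples):
        """Infer the step between the last two triples and apply it once more."""
        a, b = triples[-2], triples[-1]
        step = tuple(y - x for x, y in zip(a, b))
        return tuple(x + s for x, s in zip(b, step))

    print(next_triple([(2, 3, 4), (3, 4, 5), (4, 5, 6)]))  # prints (5, 6, 7)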

Besides induction, we naturally accept as valid the mental process of exhaustive search. When we enumerate all possibilities in a problem and select one or more to use, we believe we have a valid approach. From this springs the value of truth tables, which we employ extensively in logic and digital design to prove the truth of an expression of variables. We assign a value of true (T) or false (F) to each input or variable, and a value to the output. For example, we define the logical function AND for two inputs (A, B) by the truth table:

A  B  A.AND.B
F  F  F
F  T  F
T  F  F
T  T  T

In other words, we accept that logically the output is true only when both inputs are true.

Similarly, we accept a double negative as positive. We say, if someone is not dishonest, that person is honest.
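Both notions can be checked mechanically by exhaustive search; a short Python sketch that reproduces the AND table above and verifies double negation:

    from itertools import product

    sym = {False: "F", True: "T"}

    # Exhaustive search: enumerate every combination of truth values for A and B.
    print("A  B  A.AND.B")
    for a, b in product((False, True), repeat=2):
        print(sym[a], sym[b], sym[a and b])

    # Double negation: not (not x) gives back x for every truth value.
    assert all((not (not x)) == x for x in (False, True))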

Such notions are common sense, the endowment of every normal human being when mature enough, as Plato shows in his Socratic dialogue "Meno," in which, with simple questions, Socrates guides an illiterate slave boy to prove a non-trivial theorem in geometry.

We can assume that our ability to employ basic precepts, logical or of other kinds, evolved as we evolved as a species, or if we are religious, these precepts were granted to us by our Creator. As an impartial observer in the arguments on religion, I have no doubt that our bodies have evolved together with our primate cousins, but at some points in time factors entered the evolutionary equation for humans from a realm outside ordinary reality. Our minds, not our bodies, are the result of this influence.

We can begin to see now how to grow and expand our minds, seeking new words and concepts: by keeping busy creating, composing, synthesizing; going into analysis only to learn structures as they exist now, then freeing ourselves from old disciplines to build something new with the use of our inborn precepts. In this process we escape from logical, established thinking and use our right brain to explore problems in a non-linear, holistic, integrated fashion.

What we prize most in somebody's work is not something done to perfection, following tradition, an old recipe, or formula, valuable as those may be, but something from the spirit of the worker, different as the worker is from others, different, and new, and wonderful.

Yet, observe carefully around you: most of us are content to think as we were taught in family and school, with all the well-tested concepts handed down to us through the ages; we live by the habits we acquired in childhood, refusing to change even when these habits of thought and behavior lead us to perdition. Why is this so? Change is shaky ground for most of us; it is fearful, uncomfortable, even painful. How many of us seek adventure and knowledge with a passion? For an adult, change in established habits is very, very difficult, almost impossible; the will operates like a rudder on a supertanker, small against the vessel's bulk and slow to turn it. Few people, somewhere between one and five percent, recover for life from addictions such as alcoholism, overeating, or gambling. We are all slaves to habitual thoughts which may have served us or our ancestors well in the past; we are prisoners inside a steel cage of entrenched concepts.

I too am a prisoner and find it virtually impossible to change my ways, even when I see that I desperately need to do so. I am trapped in a web of concepts, woven for me by others in the past, yet I refuse to accept my situation. I dig tunnels in my prison to escape. I seek the clarity of reason, now and until my end, searching for more efficient concepts. I seek to adapt, to form new ideas and behaviors, slowly, painfully, breaking away from harmful habits of thought and behavior to improve myself.

I want to explore the world of ideas, to find new models for the data available to me, new paradigms or patterns. I conceive new patterns by focusing, concentrating on the problem I face; concentration may be the main difference between the innovator and the hack. Can you think of anything of importance somebody achieved without possessing great passion? A genius is a smart person who struggles fiercely to achieve, focusing attention on a subject like a laser until a meaningful pattern emerges; thus the clever person improves in mind and life.

We improve and evolve into new worlds of thought and language when submerged in the different state of consciousness of deep meditation. Language is a left-brain activity, while mental exploration, creativity, is a right-brain activity. It is their combination that leads us to pose new problems and locate their solutions, with the right balance of discipline from our left brain and freedom to dream from our right brain; a balance that also works well for a creative society.

I suspect our ancestors diverged from the other apes when they began to meditate: to enter an alternate state of consciousness, leading to greater awareness, perhaps contacting a source of wisdom much greater than themselves, and acquiring new concepts in doing so.

A new concept is something we see clearly in our minds: a pattern we recognize that we had not sensed previously, a model derived from a collection of input data organized in a meaningful form, to which we give a name. Computers, in spite of their enormous computational powers and memories, are still unable to equal humans in pattern recognition, even in apparently simple tasks such as character or speech recognition. Programmers keep trying to develop systems to perform such functions and to perceive structures in data, but progress is slow in this field. There is a barrier to progress in machine intelligence; we are getting closer to the design of a smart computer, but the difficulty seems asymptotic, as if God does not yet want us to get there. So far, our inborn precepts or truths have not transferred to our machines.

Now as to these native, wired-in precepts and axioms, how can we acquire new and more advanced ones? Our inborn truths are God-given or Nature-given, representing our basic capabilities as human beings. As a male peacock has fine tail feathers, as a songbird has lovely notes, as a seagull has great wings for flying, so we humans possess basic precepts, certain fundamental truths that guide our reason and feelings. Computers at the machine level can perform a number of operations on switches, turning them off or on, states we represent as 0 or 1. Each computer design is capable of a certain repertoire of such binary manipulations; it cannot exceed that repertoire, no matter how sophisticated a high-level language we employ in our programming. Yet, employing operations on 0s and 1s in memory registers, computers can perform an enormous range of activities, which we all experience today in appliances, cell phones, movie graphics, and the Internet. In 1936 Alan Turing showed that an elementary digital machine with the simplest of memories, the universal machine behind today's stored-program computers, could do any computation imaginable, given good programming and sufficient time. It is the same with humans; although limited in our capabilities by our internal wiring, with good programming we can achieve great things as individuals and as a species.
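A minimal sketch of that idea in Python: a toy Turing-style machine whose whole repertoire is reading, writing, and moving over tape symbols. The three-rule program below (my own illustration, not Turing's construction) adds one to a binary number:

    # Transition table: (state, symbol read) -> (next state, symbol to write, head move).
    rules = {
        ("carry", "1"): ("carry", "0", -1),  # 1 plus a carry is 0; the carry moves left
        ("carry", "0"): ("halt",  "1",  0),  # 0 plus a carry is 1; done
        ("carry", " "): ("halt",  "1",  0),  # ran off the left end: write a new leading 1
    }

    def run(bits):
        tape = {i: ch for i, ch in enumerate(bits)}
        state, head = "carry", len(bits) - 1          # start at the rightmost bit
        while state != "halt":
            state, symbol, move = rules[(state, tape.get(head, " "))]
            tape[head] = symbol
            head += move
        return "".join(tape[i] for i in sorted(tape))

    print(run("1011"))  # prints 1100: eleven plus one is twelve in binary

Everything the machine does reduces to such switch-like operations on symbols, yet tables like this one, made large enough, suffice for any computation.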

To do more than we are wired to do, we will need to redesign our brains, which may be possible with the new technologies of bioengineering. Presently we are stuck with our inborn precepts. As flies keep buzzing to get through the glass of a window, so we butt our heads against some problems we cannot solve given our mental equipment.

In the meantime, we have our current powers of analytical (logical) thought and of meditation, two powers most of us use too little to achieve much of value in our lives. Excellent treatises on logic are available, from Aristotle to Carnap's symbolic logic; as for meditation techniques, we have Herbert Benson's "The Relaxation Response" and "The Breakout Principle." Dr. Benson was a Harvard medical researcher, but such techniques for tapping our native powers of creativity are available in many good sources. Basically, to augment thought we need to sweat out the details for quite a while, observing nature, collecting data, recording, experimenting, measuring (doing good old science almost to exhaustion), then relaxing with walks, music, baths, and the like, until we get our breakthrough to a discovery or invention. If the breakthrough, the main insight into the solution of our problem, does not yet come, we repeat the cycle.

The above cycle of work can be distributed among different people. For example, much progress in physics comes through collaboration between experimental physicists and theoreticians, such as Leon M. Lederman at Fermilab and Murray Gell-Mann at Caltech. One collects vast amounts of data with ingenious experiments; the other studies the accumulated raw information and models thoughts to fit it, providing explanations. Consider the collaboration of Carl Friedrich Gauss in the nineteenth century, cranking out mathematics in Göttingen all his life, with his contemporary, the world traveler and measuring guru Alexander von Humboldt. Together they contributed much to our understanding of magnetism; the gauss is now a unit of magnetic field strength.

Similarly, Einstein is credited with many important advances in theoretical physics, which he would not have achieved without the data provided to him by experimenters. Could a mind such as Einstein's think without words, without symbols? Yes, but not very well, very far, or for very long. In his youth Einstein left high school and joined his family in northern Italy, where, in the clear light of the countryside, he began the musings that would grow into special relativity. He set down those gossamer early notions in his twenties using the language of mathematics. At the time of his death, his blackboard at Princeton was covered with equations on general relativity, equations still preserved by the Institute for Advanced Study. Einstein needed to write down his thoughts to make progress and preserve their memory. The story is told of Gauss jumping out of his wedding bed to record an equation that occurred to him while making love to his young wife.

It is clear we need symbols which we can manipulate like things in order to advance thoughts to fruitful conclusions. In mathematics we start with simple numbers: not so simple actually, as explained by Gauss in his monumental "Disquisitiones Arithmeticae," completed when he was about twenty-one. Anyway, somewhere in our human past we started counting things; numbers emerged, and with them the beginning of mathematics. Numbers were a monumental discovery. Chickens cannot count above two: if a hen has lost a number of her chicks to predators, she still clucks happily if she has at least two left; or maybe she is just stoical. Indian mathematicians invented the numerals, known to us as Arabic, that we use today, including the powerful zero. The ancient Greeks and Romans, in spite of their advances in mathematics, used letters for calculations, with no symbol for zero. We use letters in algebra, an Arabic word, to solve problems with known and unknown numbers; algebraic equations were a higher level of abstraction. Then René Descartes in the seventeenth century gave us the coordinate system, and we could represent algebraic equations pictorially: the equation y = mx + b could be shown to be a straight line intersecting the y-axis at b with a slope of m. We now had analytic geometry, a still higher level of abstraction in mathematics. Moreover, mathematicians devised a solution for the square root of a negative number, inventing complex numbers, which were given the unfortunate name of imaginary as opposed to the old real numbers.
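A few lines of Python make two of those levels concrete: the line y = mx + b as a function, and the once-scandalous square root of a negative number as an ordinary complex value (the slope and intercept are arbitrary illustrations):

    import cmath

    # Analytic geometry: the line y = m*x + b, with slope m and y-intercept b.
    m, b = 2, 1
    def line(x):
        return m * x + b
    print([line(x) for x in range(4)])  # [1, 3, 5, 7]

    # Complex numbers: the square root of -1 is a usable value, written 1j.
    i = cmath.sqrt(-1)
    print(i, i ** 2)                    # 1j (-1+0j)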

In the same fruitful seventeenth century, Newton and Leibniz invented calculus independently: Newton was looking for a practical tool to calculate motions, while Leibniz developed calculus from a more formal mathematical viewpoint. Calculus embraces the fantastic notions of the infinite and the infinitesimal, notions that had puzzled the Greeks, including the superlative genius Archimedes, in problems such as squaring the circle (the area pi times r squared); the number pi could now be calculated to any degree of accuracy, instead of being estimated. Differential equations emerged from calculus, with much more power of abstraction than algebraic equations; scientists could now apply them to explain precisely phenomena that had been intractable, such as electromagnetism. With differential equations in vector form, Maxwell could state the laws of electromagnetism concisely and elegantly, covering a vast array of physical eventualities which could be investigated by integrating the equations. That is abstraction.
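One fruit of those infinite processes is that pi becomes the limit of a convergent series. A sketch in Python using the Nilakantha series, one of many such expansions (Newton and Leibniz each had their own):

    def pi_nilakantha(terms):
        """Approximate pi as 3 + 4/(2*3*4) - 4/(4*5*6) + 4/(6*7*8) - ..."""
        total, sign = 3.0, 1.0
        for k in range(1, terms + 1):
            n = 2 * k                                    # 2, 4, 6, ...
            total += sign * 4.0 / (n * (n + 1) * (n + 2))
            sign = -sign
        return total

    print(pi_nilakantha(1000))  # 3.141592653... accurate to about nine decimals

Each added term tightens the estimate; that is calculating pi to any degree of accuracy rather than merely estimating it.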

In the twentieth century we saw parallel developments of abstraction in other disciplines: psychology, sociology, economics, and the arts of painting, sculpture, music, and the theater. Progress continued in dealing with random phenomena and the previously intractable problem of probability, which were now placed on a firm footing of mathematical rigor, thanks mainly to French, German, Indian, and Russian thinkers, who discovered fitting axioms and proved theorems now bearing their names.

Again, the point is: we need language growth not only to communicate better with each other, but mainly to think better individually and together with other well-trained minds.

It is said that Nikola Tesla, the genius of alternating currents, could mentally visualize complex electrical machinery in every detail; even so, his designs had to be put down on paper to be fabricated and tested. In the award-winning movie "Amadeus," Mozart is depicted as holding entire symphonies in his head, later writing down the notes without the smallest error or correction. Were those cases thinking without symbols? No. Mozart and Tesla could retain in the mind large amounts of the symbols of their disciplines in the right organization or structure. I suspect Mozart did a lot of rewriting in his head before coming to the right form for a symphony or concerto. Indeed, it was the structure of the symbols that made the retention of so much information possible for the two creative geniuses, thanks to their superb intellect and training.

Will computers ever develop intellects such as Mozart's or Tesla's? Can we program digital machines to organize data and think as humans do? Over forty years ago, I. J. Good, a British mathematician, predicted the development of intelligent machines which would help us design even more intelligent ones, an evolution resulting in machine superintelligence surpassing our own by far, by the year two thousand. Arthur C. Clarke, a British science fiction writer trained in physics and mathematics, wrote "2001: A Space Odyssey," with HAL, an intelligent computer, as a protagonist. At the end of the story HAL fails, but his astronaut companion grows into a superhuman with the help of an alien power and returns to earth as a star child. That was advancing evolution by deus ex machina.

We have seen many advances in digital machines in the past few decades: vast, superfast memories and central processing units, sophisticated programming for personal computers and mainframes, and computer communications such as the Internet, all in very small packages; following Moore's Law, this progress continues. We have not seen smart machines, although we do have character readers and speech recognizers with 95-99% accuracy after training. We are seeing more of an integration of human and machine capabilities in many fields, with the human mind augmented by computers. We do much of our fact-finding today with the help of Google and other search engines, collecting data from innumerable sites on the Net. We are getting closer to a combined human-machine superintelligence rather than a separate machine or human one.

With computer languages becoming ever closer to human ones, we are reaching the point when we will be communicating with our machines much more effectively, integrating our minds and bodies with their circuitry, and starting on the road to evolving as Homo digital.
