Educational CyberPlayGround ®

Computer Languages

2015 Theoretical Physicist Finds "Computer Code" in the Fabric of Space. He also makes a leap when he claims this points to an "intelligent creator, and therefore it was not created by accident. In other words, the Prime Creator exists!" ~ anon

Linguistics LANGSEC

What the US Army Says:
The Chomsky Hierarchy
Time Complexity and Polynomial Time
Other Papers:


WHAT IS CODE? by Paul Ford

A computer is a clock with benefits. They all work the same, doing second-grade math, one step at a time: Tick, take a number and put it in box one. Tick, take another number, put it in box two. Tick, operate (an operation might be addition or subtraction) on those two numbers and put the resulting number in box one. Tick, check if the result is zero, and if it is, go to some other box and follow a new set of instructions.
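Ford's tick-by-tick description can be sketched as a toy machine in a few lines of JavaScript. This is a hypothetical illustration, not any real instruction set: the two "boxes" are variables, and each tick executes one numbered instruction.

```javascript
// A toy "clock with benefits": two boxes and a numbered instruction list,
// one instruction per tick. Purely illustrative; not a real machine.
function run(program, box1, box2) {
  var pc = 0;      // which instruction the clock is pointing at
  var ticks = 0;
  while (pc < program.length && ticks < 1000) {
    var op = program[pc][0];
    var arg = program[pc][1];
    ticks++;
    if (op === "sub") {            // tick: operate on the two boxes
      box1 = box1 - box2;
      pc++;
    } else if (op === "jnz") {     // tick: check if zero; if not, jump
      pc = (box1 !== 0) ? arg : pc + 1;
    } else {
      pc++;
    }
  }
  return { box1: box1, ticks: ticks };
}

// Keep subtracting box two from box one until box one hits zero:
// 12 - 3 - 3 - 3 - 3 = 0, in eight ticks.
var result = run([["sub"], ["jnz", 0]], 12, 3);
console.log(result.box1, result.ticks); // 0 8
```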

Writing and Programming

"Therefore its name was called Babel, for there the Lord confused the language of all the earth." ~ Genesis 11:9


Emil Kozole created Seen, a font that redacts certain words as you type—a clever automatic ligature hack. It comes in three cuts, with varying degrees of censorship.

Seen is a font preloaded with a set of sensitive "spook words" that the NSA and other agencies use to scan through our documents. The typeface can be used in any popular software such as Illustrator, InDesign, or Word, or in a browser. It works normally for writing text, but once one of the words on the "list" is typed, the font automatically crosses it out, giving you an overview of your text and highlighting where you are potentially prone to surveillance. The name comes from the "Seen" notice Facebook shows when the other user reads your message.

See also: Christian Naths' Redacted Script, in which every character is the same block or squiggle, designed to resemble redacted documents. Designers like it for making placeholder text genuinely abstract.


A graph of programming languages showing their influences, companies, developers, dialects, and implementations.

Writing / Programming


What "girls can do".

"For me, a computer program was more like a poem than like a machine, and I self-identified as a language and music geek, more than the kind of kid who takes machines apart." ~ anon


Writing Grammar
If diagramming sentences is too arcane, then K12 schools should at least teach sentence structure and parts of speech (subject, verb, object). Perhaps requiring a second language would help too. A lot of us learned grammar best by learning a second language. If students learned grammar, then they would be prepared for "if/then" statements in programming! :-)
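The grammar-to-code analogy can be made literal: an English conditional sentence has the same shape as an if/then statement. A throwaway JavaScript sketch (the sentence object and function are invented for illustration):

```javascript
// "If a sentence has a subject, a verb, and an object, then it is complete."
// The same conditional, written as code. Hypothetical example.
function isComplete(sentence) {
  if (sentence.subject && sentence.verb && sentence.object) {
    return true;   // then-branch: the condition held
  }
  return false;    // otherwise: something is missing
}

console.log(isComplete({ subject: "the dog", verb: "chased", object: "the ball" })); // true
console.log(isComplete({ subject: "the dog", verb: "barked" }));                     // false
```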





Rough Timeline of Web Technologies

1972 Stanford Artificial Intelligence Project
PUB was an early scriptable markup language. It was similar in concept to today's web scripting languages, especially PHP and JavaScript. But, like Microsoft Word, its purpose was to create paginated documents. PUB was the brainchild of Les Earnest of the Stanford Artificial Intelligence Laboratory. Under his direction, I designed the language and implemented the compiler in 1971. It ran on Digital Equipment Corporation (DEC) hardware. I worked at SAIL, the lab, as a consultant in 1967, and as an employee of Kenneth Colby, M.D., from 1968 to mid-1970.

2014 Why Coders Are Going Nuts Over Apple’s New Programming Language

When Apple unveiled a new programming language at its World Wide Developers Conference on Monday, the place went “nuts,” erupting with raucous cheers and applause. It was the coding-world equivalent of Oprah giving away all those free cars. WWDC is a gathering of people who build software applications for Apple hardware devices—from the iPhone and the iPad to the Mac—and with its new language, dubbed Swift, Apple is apparently providing a much faster and more effective means of doing so, significantly improving on its current language of choice, Objective-C. With something that Apple calls an “interactive playground,” Swift is even exploring a highly visual kind of programming that may go beyond other mainstream languages.

All those developers went nuts not only because they love Apple, but because the new language could make their lives that much easier. If it lives up to Apple’s billing, Swift may also allow a whole new type of coder to build applications for devices running the iOS and Mac OS X operating systems. “It could lower the barrier to entry for Apple developers,” says Caylan Larson, an iOS and Mac OS developer based in Winona, Minnesota, who watched the WWDC keynote online and is already poring over the new guide that details the Swift language. “It could open a lot of new doors for a lot of people.”

But there’s a flip side. Even though Swift could ultimately ease the process of building apps for Apple hardware, existing developers like Larson must first take the extra time needed to learn the new paradigm. “Do I put all my projects on hold while I pick this thing up?” he says. “It’s a balancing act.” Swift may look like the future, but the world is littered with programming languages that promised to make life easier for developers before ultimately fading into obscurity because people just didn’t want to deal with something new.

What’s more, Swift seems to extend Apple’s split with the rest of the software development universe. Many coders would prefer that Apple shift toward tools that would also let them build software for other machines from other vendors. But Tim Cook and company are traveling in the opposite direction. “Swift has all the right check boxes, but do we really need something that’s proprietary to Apple’s platform?” asks programming guru David Pollak. “Yes, it solves a lot of problems, but it’s yet another way to drive a wedge between iOS development and everything else.”

This is unlikely to harm Apple any time soon. In fact, the company prefers things this way. It insists on defining its own rules, and its devices are so widely used, it knows that large numbers of developers will happily build apps for them no matter what language this requires, driven by the enormous dollar signs they see in names like the iPhone and the Mac. But life would be even easier for these developers if they could build applications that instantly ran on all devices, from Android phones to iPads. We were already a long way from that, and now Apple has taken us even further away. [snip]

2012 Engineers rebuild HTTP to improve speed for a faster Web.
The formal process of speeding up Hypertext Transfer Protocol.
SPDY has a big head start in the market.
SPDY's technologies for faster HTTP include "multiplexing," in which multiple streams of data can be sent over a single network connection; the ability to assign high or low priorities to Web page resources being requested from a server; and compression of "header" information that accompanies communications for resource requests and responses. Better performance turns out to lead to more time spent on pages, more e-commerce transactions, more searches, more participation.
HTTP was the product of Tim Berners-Lee and fellow developers of the earliest incarnation of the World Wide Web more than 20 years ago. Its job is simple: a browser uses HTTP to request a Web page, and a Web server answers that request by transmitting the data to the browser. That data consists of the actual Web page, constructed using technologies such as HTML (Hypertext Markup Language) for describing the page, CSS (Cascading Style Sheets) for formatting and some visual effects, and the JavaScript programming language. Web developers can do a lot to improve performance by carefully optimizing their Web page code. But improving HTTP itself gives a free speed boost to everybody on top of that.
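To make the request/response cycle concrete, here is what a minimal HTTP/1.1 request looks like on the wire: plain-text headers, repeated uncompressed on every request, which is part of what SPDY's header compression targets. A JavaScript sketch; the host and path are made up.

```javascript
// Build the raw text of a minimal HTTP/1.1 request. Every request repeats
// headers like these uncompressed, which is part of what SPDY set out to fix.
function buildRequest(host, path) {
  return [
    "GET " + path + " HTTP/1.1",  // request line: method, resource, version
    "Host: " + host,              // headers, one per line, plain text
    "Accept: text/html",
    "",                           // blank line ends the header block
    ""
  ].join("\r\n");
}

var request = buildRequest("example.com", "/index.html");
console.log(request.split("\r\n")[0]); // GET /index.html HTTP/1.1
```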
Users care about encryption, and the fact that modern mobile phones can handle encryption means that it's feasible for other devices to use it, too. And although an encrypted channel all the way from a browser to a Web server can damage the businesses of content delivery networks, which cache data on intermediate servers to speed up Web performance, the user should come first, said Mike Belshe, co-creator of SPDY. "Users care about privacy and security more than whether some guy can cache something in the middle," Belshe said. "Security is not free, but we can make it so it's free to users."





New Selectors
Finding elements by id, tag name, and class (DOM API)

var el = document.getElementById('section1'); el.focus();
var els = document.getElementsByTagName('div'); els[0].focus();
var els = document.getElementsByClassName('section'); els[0].focus();

Finding elements by CSS syntax (Selectors API)

var els = document.querySelectorAll("ul li:nth-child(odd)");
var els = document.querySelectorAll("table.test > tr > td"); // note: browsers insert an implicit tbody, so in practice "table.test > tbody > tr > td" is the selector that matches

October 9, 2008 Tokeneer uses mathematical proofs to establish security
NSA posts secrets to writing secure code
The Tokeneer case study serves as an example of writing low-defect, highly reliable code, researchers claim. By Joab Jackson
The National Security Agency has released a case study showing how to cost-effectively develop code with zero defects. If adopted widely, the practices advocated in the case study could help make commercial software programs more reliable and less vulnerable to attack, the researchers of the project conclude.
The case study is the write-up of an NSA-funded project carried out by the U.K.-based Praxis High Integrity Systems and SPRE Inc. NSA commissioned the project, which involved writing code for an access control system, to demonstrate high-assurance software engineering.
With NSA's approval, Praxis has posted the project materials, such as requirements, security target, specifications, designs and proofs.

The code itself, called Tokeneer, has also been made freely available.


"The Tokeneer project is a milestone in the transfer of program verification technology into industrial application," said Sir Tony Hoare, the noted Microsoft Research computer scientist, in a statement. "Publication of the full documents for the project has provided unprecedented experimental material for yet further development of the technology by pure academic research."
Developing code with very few defects has long been viewed as a difficult and expensive task, according to a 2006 paper by Praxis engineers describing the work that was published in the International Symposium on Signals, Systems and Electronics. For this project, three Praxis engineers wrote 10,000 lines of code in 260 person-days, or about 38 lines of code per day. After the project was finished, a subsequent survey of the code found zero defects.
Moreover, Tokeneer meets or exceeds the Common Criteria Evaluation Assurance Level (EAL) 5, researchers said. Common Criteria is an ISO-recognized set of software security requirements established by government agencies and private companies. Industry observers have long concluded that it would be too expensive for commercial software companies to write software programs that would meet EAL 5 standards.
According to the 2006 paper, the engineering team used a number of different techniques for writing the code, all bundled into a methodology they call Correctness by Construction, which emphasizes precise documentation, incremental developmental phases, frequent verification and use of a semantically unambiguous language.
The developers wrote the code in a subset of the Ada programming language called SPARK (Ada was named for Augusta Ada King, Countess of Lovelace, daughter of Lord Byron), which allows for annotations that permit static analysis of the program. They used the GNAT Pro integrated developer environment software from AdaCore.
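SPARK's annotations let a static analyzer prove preconditions and postconditions before the program ever runs. As a rough everyday analogue (checked at runtime rather than proved statically), the idea looks like this in JavaScript; the function and its contract are invented for illustration:

```javascript
// Runtime sketch of the pre/postcondition idea behind SPARK annotations.
// SPARK proves conditions like these statically; here they are only
// checked when the function is called.
function withdraw(balance, amount) {
  // precondition: the caller must request a positive, affordable amount
  if (!(amount > 0 && amount <= balance)) {
    throw new Error("precondition violated");
  }
  var result = balance - amount;
  // postcondition: the remaining balance can never be negative
  if (!(result >= 0)) {
    throw new Error("postcondition violated");
  }
  return result;
}

console.log(withdraw(100, 30)); // 70
```

The payoff of the SPARK approach is that violations are ruled out at analysis time, so the runtime checks above become provably unreachable.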



We are all natural language searchers
Lorenzo Thione, co-founder of Powerset with Barney Pell, argues that we are all natural language searchers. He surveyed the underlying themes in much of the criticism in the current blogstorm about Powerset and natural language search.




LOL, texting, and txt-speak: Linguistic miracles
A linguist surprises the TED crowd; apparently txt-speak really is special.

Is texting shorthand a convenience, a catastrophe for the English language, or actually something new and special? John McWhorter, a linguist at Columbia University, argues for the last. According to McWhorter, texting is actually a new form of speech, and he outlined the reasons why today at the TED2013 conference in Southern California.

We often hear that “texting is a scourge,” damaging the literacy of the young. But it’s “actually a miraculous thing,” McWhorter said. Texting, he argued, is not really writing at all—not in the way we have historically thought about writing. To explain this, he drew an important distinction between speech and writing as functions of language. Language was born in speech some 80,000 years ago (at least). Writing, on the other hand, is relatively new (5,000 or 6,000 years old). So humanity has been talking for longer than it has been writing, and this is especially true when you consider that writing skills have hardly been ubiquitous in human societies.

Furthermore, writing is typically not a reflection of casual speech. “We speak in word packets of seven to 10 words. It’s much more loose, much more telegraphic,” McWhorter said. Of course, speech can imitate writing, particularly in formal contexts like speechmaking. He pointed out that in those cases you might speak like you write, but it's clearly not a natural way of speaking.

But what about writing like you speak? Historically this has been difficult. Speed is a key issue. “[Texting is] fingered-speech. Now we can write the way we talk,” McWhorter said. Yet we view this as some kind of decline. We don’t capitalize words, obey grammar or spelling rules, and the like. Yet there is an “emerging complexity…with new structure” at play. To McWhorter, this structure facilitates the speed and packeted nature of real speech.

Take "LOL," for instance. It used to mean “laughing out loud,” but its meaning has changed. People aren’t guffawing every time they write it. Now “it’s a marker of empathy, a pragmatic particle,” he said. “It’s a way of using the language between actual people.”

This is just one example of a new battery of conventions McWhorter sees in texting. They are conventions that enable writing like we speak. Consider the rules of grammar. When you talk, you don’t think about capitalizing names or putting commas and question marks where they belong. You produce sounds, not written language. Texting leaves out many of these conventions, particularly among the young, who make extensive use of electronic communication tools.

McWhorter thinks what we are experiencing is a whole new way of writing that young people are using alongside their normal writing skills. It is a “balancing act… an expansion of their linguistic repertoire,” he argued.

The result is a whole new language, one that wouldn't be intelligible to people in the year 1993 or 1973. And given where it's headed, it would likely be unintelligible to us if we jumped ahead 20 years in time. Nevertheless, McWhorter wants us to appreciate it now: "It's a linguistic miracle happening right under our noses," he said.

Forget the "death of writing" talk. Txt-speak is a new, rapidly evolving form of speech.


MACHINE LANGUAGE AND Artificial Intelligence - AI


Index of Machine Learning Courses. Maintained by Vasant Honavar, Artificial Intelligence Research Group, Department of Computer Science, Iowa State University.

Natural Language Processing Course Listing, part of the 2004 NLP Course Survey conducted by ACL (Association for Computational Linguistics).

A Brief History of Programming Languages
This timeline covers innovations in languages used for programming computers from 1946-1995. Entries include the development of FORTRAN  (mathematical FORmula TRANslating system) in 1957, COBOL (COmmon Business-Oriented Language) created in 1959, Bill Gates and Paul Allen's version of BASIC (Beginner's All-purpose Symbolic Instruction Code) in 1975, and more.


A curious chapter in AI history, where researcher Kenneth Colby used the Turing Test to see whether psychiatrists could distinguish between delusional patients and his natural language paranoia simulator 'PARRY'.
PARRY was designed by Colby, who was both a psychiatrist and computer scientist, in an attempt to simulate the psychology of paranoia. In particular, the programme was designed to replicate paranoid delusions about being persecuted by the Mafia. Daniel Dennett's 1990 article, entitled "Can machines think?", discusses whether the Turing Test is an adequate test of machine intelligence. Dennett notes that PARRY is the only programme known to have passed the Turing Test: psychiatrists were unable to distinguish between real patients and simulated ones.

Learning Morse Code is really just learning to understand English, plus some ham radio idioms.

Braille - learning to understand English as expressed in a tactile code, plus some local dialect.

Peter Wilkness of the CATANAL project (for native Alaskan languages) SEE Universal Declaration of Linguistic Rights

Hou tu pranownse Inglish with a sample lexicon and a set of spelling rules which you can use with my Sound Change Applier to automatically derive the pronunciation.

MIT OpenCourseWare: "a free and open educational resource for faculty, students, and self-learners around the world. OCW supports MIT's mission to advance knowledge and education, and serve the world in the 21st century."

Computerate adj. Computer literate.

Greg Ulmer, an English professor at the University of Florida, says universities must "teach students to be as computerate as they are literate." His students use hypertext and multimedia elements in their writing assignments, he adds.

How people invent new words that become part of a language


Find where the word "cyberspace" comes from: a book by William Gibson called Neuromancer is where the term was first coined. Read both Neuromancer (chapter 1) and Snow Crash.

John Perry Barlow
Rancher and lyricist for the Grateful Dead who applied the word "cyberspace" to the Internet. Also see the EFF's Open Audio License (OAL).

Michael Hauben invented the word "Netizen".

Ted Nelson, the inventor of the word "hypertext".

Ralph E. Griswold passed away October 4, 2006
He became a member of the Programming Research Department at Bell Laboratories in 1962, where he started research on symbolic computation and the design and implementation of high-level programming languages for non-numeric computation. This work led to the development of the first SNOBOL language. Subsequent work led to the SNOBOL4 programming language, which is still in use today. Mike Radow and Ed Feustel have started a SNOBOL mail list to replace the defunct one; it is being hosted by Yahoo Groups under the name snobol.

Find the origin of SURF THE NET



When unwanted email first came along, people invented different words for it, such as unsolicited email and junk email. But eventually "spam" became the word of choice to describe the phenomenon.
It's a process that happens each time a new thing needs a name, but language researchers have struggled to model how it happens without a central decision maker. Now a computer model shows the process at work - and may give insights into how the first human languages emerged.
Luc Steels of the Sony Computer Science Laboratory in Paris, France, and his colleagues studied the "naming game", a simple computer model that reflects how people invent words and use them. In the game, a group of "agents" live in a virtual environment with a number of "objects". Each agent makes up random names for the objects, and the agents then interact in pairs, trying to "talk" about those objects.
In each interaction, one agent (the speaker) says its word for an object, while the second agent (the hearer) listens. If the hearer fails to recognize the word, it memorizes it as a possible name for the object. But if the hearer understands the word, both agents retain this word in memory and ditch any others they have made up or heard.
Repeated over and over again, this process reflects how people invent and share new words for objects: they constantly invent new words, yet can only use ones that others understand, so it keeps a lid on the number of words in use.
The simulations showed that this is enough for the emergence of a unique shared vocabulary. In the model, each object always ends up being described by just one word.
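The interaction rules above are simple enough to simulate directly. Here is a minimal sketch of the naming game in JavaScript, for a single object; the population size and round count are arbitrary choices, not values from Steels' study.

```javascript
// Minimal naming game: agents invent random words for one object, interact
// in speaker/hearer pairs, and on success both keep only the winning word.
function namingGame(nAgents, rounds) {
  var agents = [];
  for (var i = 0; i < nAgents; i++) agents.push(new Set());
  var nextWord = 0;
  for (var round = 0; round < rounds; round++) {
    var s = Math.floor(Math.random() * nAgents);
    var h = Math.floor(Math.random() * (nAgents - 1));
    if (h >= s) h++;                            // hearer differs from speaker
    var speaker = agents[s], hearer = agents[h];
    if (speaker.size === 0) speaker.add("w" + nextWord++);  // invent a word
    var words = Array.from(speaker);
    var word = words[Math.floor(Math.random() * words.length)];
    if (hearer.has(word)) {                     // success: both ditch all others
      speaker.clear(); speaker.add(word);
      hearer.clear(); hearer.add(word);
    } else {
      hearer.add(word);                         // failure: hearer memorizes it
    }
  }
  var all = new Set();
  agents.forEach(function (a) { a.forEach(function (w) { all.add(w); }); });
  var oneEach = agents.every(function (a) { return a.size === 1; });
  return { converged: oneEach && all.size === 1 };
}

console.log(namingGame(10, 20000).converged); // converges to one shared word
```

With these sizes the population reliably settles on a single word long before the rounds run out, which is exactly the convergence the simulations describe.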
"The model is as simple as possible," says Steels. "But it captures the main ingredients of how a population develops an efficient communication system." So could a similar process have helped the historical emergence of human languages?
"Absolutely," says linguist James Hurford of the University of Edinburgh, UK. But he emphasizes that in addition to common words, human language also requires richer structures such as grammar, the emergence of which the model cannot yet explain.
While Steels and colleagues hope to develop more complex models capable of evolving grammar, they already see potential applications in computing. For instance, programmers currently have to establish standards to get commercial or scientific databases to communicate effectively. It may soon be possible to get computers to talk to one another by letting them evolve a common language on their own.

Nerd: Most likely born as a nonsense word in Dr. Seuss's 1950 book, "If I Ran the Zoo."
"And then, just to show them, I'll sail to Ka-Troo / And Bring Back an It-Kutch, a Preep and a Proo, / A Nerkle, a Nerd, and a Seersucker, too!"

Googol: In the 1930s, mathematician Edward Kasner asked his 9-year-old nephew, Milton Sirotta, what he thought would be a good word to describe a large number. Milton suggested the word "googol."

Yahoo: Coined by Jonathan Swift in his 1726 satire "Gulliver's Travels," meaning an unsophisticated, brutish person.

Bug: Adm. Grace Hopper of the U.S. Navy, a computing pioneer whose work led to the COBOL programming language, told a story in which an operator solved a glitch in Harvard's Mark II computer by removing an insect (a "bug") from one of its relays. Decades earlier, Thomas Edison used the word "bug" to describe a problem in his phonograph. The Oxford English Dictionary cites this quotation from the March 11, 1889, issue of the Pall Mall Gazette: "Mr. Edison, I was informed, had been up the two previous nights discovering 'a bug' in his phonograph -- an expression for solving a difficulty, and implying that some imaginary insect has secreted itself inside and is causing all the trouble."


PollyGlotto Website Language Conversion Tool
An animated talking language translator. It's a mashup between Google Translate and SitePal.



Language identification and IT: Addressing problems of linguistic diversity on a global scale*
Peter Constable and Gary Simons, SIL International
Abstract Many processes used within information technology need to be customized to work for specific languages. For this purpose, systems of tags are needed to identify the language in which information is expressed. Various systems exist and are commonly used, but all of them cover only a minor portion of languages used in the world today, and technologies are being applied to an increasingly diverse range of languages that go well beyond those already covered by these systems. Furthermore, there are several other problems that limit these systems in their ability to cope with these expanding needs. This paper examines five specific problem areas in existing tagging systems for language identification, and proposes a particular solution that covers all the world's languages while addressing all five problems.




OBITS: David Shulman, "Sherlock Holmes of Americanisms" / Code Talker
Shulman, David. A Glossary of Cryptography. New York: 1981. [Petersen]
OBITS: David Shulman
November 7, 2004
David Shulman, a self-described Sherlock Holmes of "Americanisms" who dug through obscure, often crumbling publications to hunt down the first use of thousands of words, died last week in Brooklyn. He was 91.
His name appeared in the front matter to OED's epochal second edition, each of the Addition Series volumes and is currently on the website.
Shulman avoided excessive modesty, letting it drop that he was, at least temporarily, the last word on words that included "The Great White Way," "Big Apple," "doozy" and "hoochie-coochie".
Shulman's most pioneering effort concerned the term "hot dog". He found the word was college slang before it was a sausage, paving the way for deeper investigation.
Shulman obliterated a big impediment to finding the origins of the word "jazz" by proving it was on a 1919 record, not the 1909 version of the same disk. (Other scholars traced first use of the term to the baseball columns of Scoop Gleeson, a sports reporter writing in the San Francisco Bulletin in 1913.)
Shulman was the first to challenge the claim that "shyster" derived from a lawyer named Scheuster. Others, particularly Roger Mohovich, then traced the etymology to 1843-1844. "Shyster" turned out to be a Yiddish corruption of a German vulgarism meaning a crooked lawyer.
Every inch of Shulman, from his well-worn trainers to his plastic bag crammed with scrawled notes to his soiled baseball cap, suggested the classic New York eccentric. He recorded his finds on index cards, sending them to the OED when he accumulated 100.
David Shulman was born on 12 November, 1912, and grew up on the Lower East Side speaking Yiddish, according to an interview in the Jerusalem Report in 1999. The first library of which he became a member was a branch in the Bronx.
After City College, he devised puzzles and puzzle contests for newspapers. During the Second World War, he cracked Japanese secret codes for the army, then returned to puzzles.
He was a founder of the American Cryptogram Association and in 1976 published An Annotated Bibliography of Cryptography, still used by experts. He was a champion scrabble player and wrote a scholarly article about the
game's lexicography.
After a heart attack in his early 80's, Shulman gave beloved possessions to the New York Public Library. Gifts included a primer from Colonial America, 20,000 century-old postcards and Bowery Boys novels the library did not have. He earlier donated his cryptography collection, including a book about secret writing from 1518.
His mentor at the library was Norbert Pearlroth, a famed researcher for Ripley's Believe It or Not!. However, Shulman later came to view him as less than rigorous.
"Instead of believing it," he said in an interview in 1999. "I believed it not."
Shulman never married; indeed, he made it clear he had scant time for his only relatives, two nieces who tried to stop him from giving his treasures to the library. "I hate to say it, but your relatives can be predators," he said in the 1999 interview.
Shulman always insisted that the persnickety pickiness he exemplified rates among the supreme virtues.
"What difference does it make?" he sputtered in an interview in 1989. "Why, the same difference as being literate or illiterate, accurate or inaccurate, telling the truth or spreading yarns."

Samuel Billison Code Talker November 18, 2004
WINDOW ROCK, Ariz. (AP) - Samuel Billison, a Navajo who as a Marine during World War II helped invent a secret code based on the tribal language to confound the Japanese, died Wednesday of a heart problem, according to the Navajo Nation.
Billison didn't have a birth certificate, but he was born on a reservation in the mid-1920s. He was believed to be 78.
Billison and other Navajo Marines, called the Code Talkers, used the code and their native language to communicate troop movements and orders, developing a secret vocabulary that renamed military armaments and equipment using rough equivalents in Navajo.
Airplanes became birds, ships became fish and weapons were named after common things. The word "bomb," for example, was replaced by the Navajo word for "egg."
Billison joined the Marines after high school in 1943. He said he was sent to test as a code talker when he completed boot camp and the Marines realized he was fluent in Navajo and English.
The code talkers were not allowed to discuss their work when they returned home after the war.
The Defense Department first released information on the code talkers in 1968.
Billison was a longtime president of the Code Talker Association, and also served on the Navajo Nation Council. 

Hobo Language, Warchalking, Wardriving, and WiFi Hot Spots

Creator Bjarne Stroustrup
The Invention of C++ Programming Language

On January 1, 1998, Stroustrup supposedly gave an interview to the IEEE's 'Computer' magazine. The editors assumed he would be giving a retrospective view of seven years of object-oriented design, using the language he created. By the end of the interview, the interviewer got more than he had bargained for and, subsequently, the editor decided to suppress its contents, 'for the good of the industry'; but, as with many of these things, there was a leak. What circulates is billed as a complete transcript of what was said, unedited and unrehearsed, so it isn't as neat as planned interviews. Stroustrup himself has dismissed the interview as a hoax.

Date: Thu, 18 Dec 1997 08:02:35 +0000
From: "Mrs. Gail Watson"
Subject: Re: programming in the early years

This is an interesting thread. At my school, the teachers are very interested in the students using whatever medium is available for the kids to begin learning "programming" skills. I put that in quotes because the exact definition is in doubt.

We teach HTML coding, Logo, and BASIC (yes, the latter two are ancient). The reason we do is simple: students must learn a syntax, apply the syntax, evaluate the output against what was intended or required, and then apply the ever-popular "de-bugging" process to clear up errors in their syntax.

What you call the above exercise (programming, etc.) is not important. Actually, I tell the children they are learning the same process that computer programmers go through. I think that is a fair statement. That way there is some connection with today's world.


The mental exercise and the logical thinking is what's good for the kids. Obviously it's great if the latest and greatest tools can be used, but anything available will do the trick. We don't have a classroom full of computers that can support html, but we do have a lab that supports logo and basic. I recommend schools use whatever tools are available to get the kids thinking in a logical mode.




Mrs. Gail Watson
Computer Technologist
Pattie Elementary School
16125 Dumfries Rd.
Dumfries, Virginia, 22026 USA


Date: Fri, 19 Dec 1997 06:37:47 +0000
From: "Gary E. Karcz" <Gary.Karcz@NAU.EDU>
Subject: Re: Programming in the Early Years

This has been a quality discussion on the merits of teaching students tagging (HTML) and programming skills. We expect students to generalize the strategies developed within programming environments.

Most kids love to do any type of "programming" (I'm at a loss for a word that would be generally accepted), as long as it produces a picture they can print out and take home or put up on the web for parents to see! I tried math programming in BASIC last year, and discovered that only 1 or 2 kids in each class liked it. I thought kids would be excited to bring in their math homework, write a BASIC program to do it for them, print it out and turn it in. (The teachers all said they would accept it.) It was universally hated except for a small minority of students who really got into it.

But do they?

As of a few years ago, the results of studies focusing on the pedagogy of teaching students to program (HTML not included) are that the "jury's out" on the issue of skills-generalization -- these results indicating that problem-solving strategies are "domain specific." This means that while students are sharpening their programming skills, we may or may not see any direct benefit in other areas.

If anyone knows of current research on this topic, I would truly appreciate an update!


Gary E. Karcz -+- -+-

© Educational CyberPlayGround ® All rights reserved world wide.