Showing posts with label computer science. Show all posts

The Imprints Of Prince: Musician Inspired Massive Code-Teaching Initiative

The world lost a musical icon this week with the passing of Prince, but few of his fans knew of the inspiration he had quietly offered toward a more tech-adept and tech-aware future.  No, not the song about partying like it was 1999 (although that was tremendously pertinent at the time).  After an inspiring discussion with a proactive friend, Prince used his influence to support a nonprofit dedicated to teaching kids how to write computer code.

He wants kids to rock hacks as well as he rocks an axe.
Which is to say, crazy good.

Let My People Browse: “Lifeline” Program Brings Internet To Low-Income Families

The internet has helped to level the field of knowledge for human beings worldwide, but those whose finances have prevented them from surfing the wild waves of the web shouldn't have to suffer.  Now, thanks to a new ruling, low-income American families will receive subsidies providing them with access to all that the World Wide Web has to offer...

Of course, this is assuming they have internet-enabled computers...

Let's Have A Big Hand For The Gesture Vocalizer!

With all the avenues of communication available to human beings these days, it's hard to imagine that there could be a demographic that still finds trouble getting their ideas across.  Yet, this is just the problem that deaf people have when trying to "talk" to people that don't understand sign language.  Now, thanks to four enterprising young engineers in India, this language barrier can be broken down...

A rudimentary version of the Gesture Vocalizer.

PyGest: A Python Tkinter Tutorial, Part 4

This article is part four in our ongoing Python tkinter tutorial series. When we left off in part three, we had just completed configuring the inputs for our interface, which included text entry, label and button widgets. We'll now move on to the widgets necessary for conveying the application's two outputs: 1) the hash digest value of the file input supplied by the user, and 2) a message indicating whether the file hash generated by the app matches the optional hash value that may be supplied by the user.
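Since part 4's code isn't excerpted here, the following is a minimal sketch of what those two output widgets might look like — the function and variable names are illustrative, not necessarily the identifiers used in the actual PyGest tutorial:

```python
import hashlib

try:
    import tkinter as tk
except ImportError:  # tkinter may be absent on minimal/headless installs
    tk = None

def file_digest(path, algo="sha256"):
    """Return the hex digest of the file at `path`, read in chunks."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def match_message(digest, expected):
    """Build the text for the match/no-match output label.

    An empty `expected` value means the user supplied no hash to
    check against, so no message is shown.
    """
    if not expected:
        return ""
    return ("Hashes match." if digest == expected.strip().lower()
            else "WARNING: hashes do not match.")

def build_output_widgets(parent):
    """Create the two output widgets: one label displaying the digest,
    one displaying the match message. Returns their backing StringVars."""
    digest_var = tk.StringVar(value="(no file hashed yet)")
    match_var = tk.StringVar()
    tk.Label(parent, textvariable=digest_var, wraplength=400).pack()
    tk.Label(parent, textvariable=match_var).pack()
    return digest_var, match_var

if __name__ == "__main__" and tk is not None:
    root = tk.Tk()
    root.title("PyGest (sketch)")
    build_output_widgets(root)
    root.mainloop()
```

Backing the labels with `StringVar` means later code can update the displayed digest or match message by calling `digest_var.set(...)` without rebuilding any widgets.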

Euthanizing Youtube: Security-Testing Hacker Discovers Ultimate "Delete" Button

What if you had computer hacking skills of such astonishing power, you could bring an entire lane of the information superhighway to a screeching halt?  What would you do with your great and terrible force?  This week, one man was faced with this fascinating decision...


Forget Chess Or Jeopardy, This Poker-Playing Computer Is Near-Unbeatable

Sure, you might have survived dysentery playing Oregon Trail back in the day, or perhaps you currently enjoy slaying beasts or conquering lands in modern computer games.  But now, a new computer program can compete against even the most savvy players when it comes to a time-honored game of wits and skill:  poker.

Know when to hold 'em, know when to fold 'em, know when to walk away, and when to POWER OFF.

A new poker-playing computer has emerged as a champion among gaming machines.  While there is a long history of computers understanding and performing exceptionally well in "perfect information" games (games where both players are aware of all decisions that have been made during the progress of the match, such as checkers or chess), a new system of "learning" allowed for the unexpected.

"Solving" the game of Texas Hold 'Em via a series of bets and bluffs, the Cepheus poker-playing computer fascinatingly was taught to learn from its own mistakes.  The scientists behind the project first instructed Cepheus in the basic rules of Texas Hold 'Em and had the machine play numerous games against itself to determine a variety of outcomes.  As this occurred, Cepheus compiled a list of "regrets", where it could have bet differently, bluffed, or folded for a more auspicious outcome.

All poker players know what that kind of regret feels like.  Cepheus actually learned from it.

Cepheus was then programmed to act on its biggest regrets while ignoring the lesser ones.  Eventually, a methodology emerged for Cepheus to navigate bets and bluffs in the most effective ways possible, and the "regret" list scaled down to near zero.  This mathematical take on the game allowed Cepheus to achieve near-perfect play.
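The core idea — accumulate regrets through self-play, then favor the actions you regret not taking — is called regret matching, and it can be sketched in a few lines. Cepheus actually used a far more sophisticated variant (counterfactual regret minimization) on heads-up limit hold 'em; the toy below applies the same principle to rock-paper-scissors, where the average self-play strategy converges toward the even-thirds equilibrium:

```python
import random

ACTIONS = 3  # rock, paper, scissors (stand-ins for fold/call/raise)

def regret_matching(regrets):
    """Turn accumulated positive regrets into a mixed strategy."""
    positive = [max(r, 0.0) for r in regrets]
    total = sum(positive)
    if total == 0:
        return [1.0 / ACTIONS] * ACTIONS  # no regrets yet: play uniformly
    return [p / total for p in positive]

def train(iterations=20000):
    """Self-play: sample both players from the current strategy,
    then add each action's regret (how much better it would have done)."""
    regrets = [0.0] * ACTIONS
    strategy_sum = [0.0] * ACTIONS
    # utility[my][opp]: +1 win, 0 tie, -1 loss
    utility = [[0, -1, 1], [1, 0, -1], [-1, 1, 0]]
    for _ in range(iterations):
        strategy = regret_matching(regrets)
        for i in range(ACTIONS):
            strategy_sum[i] += strategy[i]
        my = random.choices(range(ACTIONS), weights=strategy)[0]
        opp = random.choices(range(ACTIONS), weights=strategy)[0]
        for a in range(ACTIONS):
            regrets[a] += utility[a][opp] - utility[my][opp]
    total = sum(strategy_sum)
    return [s / total for s in strategy_sum]  # average strategy

print(train())
```

The trick is that the *average* strategy over all iterations, not the final one, is what approaches equilibrium — the current strategy keeps cycling as each regret overtakes the others, exactly the "list of regrets scaling down to near zero" described above.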

Computer scientist Michael Bowling, the lead author of the study, explained that Cepheus's techniques could be extrapolated to a much wider set of purposes.  He explained, "...the techniques that we used to solve the game apply even more broadly than entertainment activities. I’m talking about any decision-making scenario. Politics becomes a game. Auctions become a game. Security becomes a game.”

"A game for me to WIN, mwahaha..."  -Cepheus, probably.

Feeling lucky?  You and Cepheus can duke it out here.  Just remember that while Cepheus's skills aren't quite perfect yet, the computer is operating well above the odds of chance thanks to its knowledge, and will likely beat you in the long run.  Having played over a billion billion hands (more poker than the entire human race has ever played) definitely gives it an edge, so don't lose your shirt!  The computer won't need it, anyway.

Cepheus is out there...don't get stung!

A Museum In Your Monitor: Immerse Yourself In Art In New Virtual-Reality Gallery

Do you like museums, but live in the middle of nowhere?  Do you long to gaze upon the world's artistic treasures, but are daunted at the thought of walking through miles of gallery halls just to spot one specialty?  Now, thanks to virtual reality, some of the finest art and artifacts are available for your perusal, in 3D, from the privacy of your own computer screen.

The University of Sheffield in England has created the "Computer Love 2.0" program to make art enthusiasm available everywhere.  Navigated with an Oculus Rift system or simply a mouse and keyboard, the Computer Love 2.0 program takes the viewer through virtual versions of Sheffield's National Fairground Archive, the Turner Museum of Glass, and the Alfred Denny Museum.

If you don't trust yourself around the artifacts (pictured) in the real-life Turner Museum of Glass, perhaps visiting the virtual version is smarter.

The galleries are not limited exclusively to artwork.  Many of the installments in these particular institutions involve animal elements, such as an eagle skull or guillemot eggs.  Dr. Steve Maddock, a member of the university's Computer Science department and one of the program's creators, explained, “Hopefully our art gallery – which explores the relationship between science and art by ‘displaying’ things like our half-specimens as artworks – will pique the interest of visitors and encourage them to make the trip to see the full collections in real life."  

With virtual reality poised to make a major impact on how we see and interpret new things to learn, this could be an important first step in sharing culture worldwide. Could the Met or the Louvre soon follow suit? Will Banksy start writing graffiti electronically? And what happens when someone creates a piece of art that REQUIRES the digital 3D format?  Someday soon, we may find out, in elegantly rendered 3D.

Now you can take a field trip anytime!

Unit Testing and Test-Driven Development in Python

There are both advantages and disadvantages to being self-taught in any given discipline. In certain cases, the advantages and disadvantages can overlap or even coincide. For example, when you are self-taught, you are not confined by institutional structures and courses of study. On the one hand, this allows for a distinct measure of freedom to pursue one’s own interests in the field, which would not necessarily be afforded to a person following a traditional disciplinary curriculum. On the other hand, this also means that it can be quite easy to develop gaps in one’s basic knowledge of the discipline, for the simple reason that these areas of study did not fall within your area of interest.

I discovered one such gap in my study of programming in general, and Python in particular, a number of months ago when I came across a quote online that went something like this: “Code that is not tested is broken by definition.”  Testing? “You mean running the code to see if it works?” I thought to myself. Within the next hour I had my first exposure to the method of test-driven development and the Python unittest module.

This was literally the exact opposite of how I had approached my own programming projects up until then, which might be termed “error-driven development”: write some code; run it; see if it works; if it doesn’t work, tinker at random until it does; write some more code and repeat. I quickly realized that, according to the above quote, all my code was broken, by definition. 

The test-driven development model is the reverse of this: write a test, run it and watch it fail; write some code to make the test pass; refactor; write another test and repeat. It was an enlightening experience to attempt writing even a simple program under a test-driven model, as it was immediately obvious that I had only the vaguest notions about things that I thought I knew fairly well.
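The red-green-refactor cycle described above is easy to see with Python's own unittest module. A minimal round might look like the following (the `fizzbuzz` exercise is just a stand-in example — the point is that each test was written, and seen to fail, before the code that passes it):

```python
import unittest

# Step 1: write a test and watch it fail (the function doesn't exist yet).
# Step 2: write just enough code to make it pass.
# Step 3: refactor, write the next test, and repeat.
def fizzbuzz(n):
    """Classic kata: 'Fizz' for multiples of 3, 'Buzz' for 5, both for 15."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

class TestFizzBuzz(unittest.TestCase):
    def test_returns_number_as_string(self):
        self.assertEqual(fizzbuzz(1), "1")

    def test_multiples_of_three(self):
        self.assertEqual(fizzbuzz(9), "Fizz")

    def test_multiples_of_five(self):
        self.assertEqual(fizzbuzz(10), "Buzz")

    def test_multiples_of_both(self):
        self.assertEqual(fizzbuzz(30), "FizzBuzz")
```

Running `python -m unittest` in the file's directory discovers and runs the tests; under true test-driven development, `test_returns_number_as_string` would have been the only test in the file when `fizzbuzz` was still an empty stub.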

Since then, I’ve re-written a number of programs I’d created for myself under a completely test-driven development model, and have integrated testing into my everyday coding practice. I’ve also collected a bunch of resources that I've found helpful along the way, which you can find below. Also, as you may know, of late there has been something of a controversy brewing over the merit and value of test-driven software development. Some links on this are supplied at the end. As always, further recommendations are welcome in the comments!

Overview of Test-Driven Development (Video Lectures)

Unit Testing in Python (Video Lectures)

Python Unittest Module Docs

Python Unittest Intro Tutorials

Test Driven Development in Python

Unit Testing Today

E-Autopsy: Surgical Students Practice On Virtual Cadaver

Medical science has come a long way from stealing executed convicts' corpses in the middle of the night for purposes of anatomical study. Now, prospective scalpel-slingers can practice on a life-sized virtual replica of a human, intricately detailed in 3-D.

The Anatomage table is a composite of CAT scans taken from every conceivable angle and position. Currently in use at the University of Edinburgh, the Anatomage allows students to both remove and replace all of a human being's bones, muscles, organs, veins, arteries, and nerves. A corresponding life-sized, 3-D hologram also joins the state-of-the-art surgery simulation. Bodies can be rotated and viewed along 3 planes for optimum operation.

Professor Gordon Findlater said the device had received good feedback from students and staff at the university, noting, "Although it will never, I believe, replace the experience of dissecting and handling a real cadaver, it will allow students to handle a virtual cadaver without all the legislation that accompanies the use of a real one."

Let's have a hand for surgical science!
Actual image from the Anatomage.

Telepresence: The Good Kind Of Mind Control

Paralysis used to mean being condemned to a life of immobility. Now, thanks to amazing technological breakthroughs, we not only have the ability to restore the power of motion to human beings, but will soon be able to utilize the same "telepresent" technology to operate robotic elements on other worlds.

This week, for the first time ever, a paralyzed young man had mobility and even a level of dexterity restored to his arm, thanks to a microchip embedded in his brain. The research team, composed of doctors from Ohio State University's Wexner Medical Center and engineers from the non-profit research center Battelle, had expected their microchip to enable motion in one finger of the paralyzed 23-year-old Ian Burkhart. Stunningly, Burkhart not only opened and closed his entire hand, but summoned the dexterity to pick up a spoon.

Burkhart had been paralyzed from the chest down for the last four years.

The fascinating new technology that enabled this breakthrough is called the Neurobridge. Starting with a .15-inch-wide chip implanted in the skull, the Neurobridge "reads" thoughts via 96 electrodes and sends them to a sleeve of receptor electrodes on the wearer's limb, traveling via an external skull-socket not unlike the plug-in ports seen in the "Matrix" movies.

Thanks to this success, Burkhart's surgeon, Dr. Ali Rezai, told the Telegraph UK, "I do believe there will be a day coming soon when somebody who's got a disability – being a quadriplegic or somebody with a stroke, somebody with any kind of brain injury – can use the power of their mind and by thinking, be able to move their arms or legs.”

The basics of the Neurobridge.

Outstanding as it is, this may be only the beginning for telepresent technology. Another organization increasingly interested in mind-powered motion is none other than NASA, who feel the technology could be applied to enabling robotic elements for complex tasks in some of the most remote places possible.

At NASA's Jet Propulsion Laboratory in California, a series of experiments for robot-human interfaces have been taking place, aiming to make telepresence a feature of future spaceflight. Currently, projects are underway using video-game technology like Xbox Kinect and Oculus Rift to manipulate robotic avatars in virtual reality, with the goal of someday allowing a human to operate them from afar. Jeff Norris, head of JPL's Planning Software Systems Group, explained:
"We want to go to a lot of different places. Mars is interesting, and we want to go there very much, but there are so many other places in the solar system. The ability to build a robot that is perfectly suited to a potentially very hazardous environment, that’s going to go swimming in the rains of Saturn, or something like that. The ability to build a robot that is optimized for that task, and then to control it in a way that makes you feel like you are there, to me feels like a very powerful competence. Because, here we are, able to use technologies that make us feel present in that environment, but in a way of inhabiting a robotic avatar that is perfectly attuned to that environment. That’s pretty phenomenal."
Beyond the virtual realm, NASA's plans to make telepresence a facet of full-on "telexploration" are already well underway. Robonaut, the humanoid robot installed on the International Space Station, can be controlled telepresently by human operators on earth, expressing 43 degrees of "freedom" via helmet-mounted units, specialized gloves, and posture-positioning trackers. According to NASA, "The goal of telepresence control is to provide an intuitive, unobtrusive, accurate and low-cost method for tracking operator motions and communicating them to the robotic system."

While NASA's plans for spacecraft and robotic control don't yet include a chip in the brain, it continues to improve on the technology that will make the virtual and actual uses of telepresence more immersive, realistic, and dexterous. New algorithms, camera-based tracking, and magnetic sensors will all add to and improve the ability to manipulate elements like Robonaut or other specialized machinery.

The concept of telepresence has been around in science fiction for as long as the genre has existed, but the term itself was coined in 1980 by MIT professor and robotics engineer Marvin Minsky. He theorized that telepresent robots would, in the 21st century, be critical operational elements for dangerous tasks like mining, responding to oil disasters, or even serious trouble like nuclear reactor meltdowns. In his Omni magazine article "Telepresence: A Manifesto", Minsky stated that when faced with the challenge of building "unbreakable" reactor parts (that will eventually someday require repair) versus building with realistic material lifespans that could be fixed via robotic telepresence, "I think the better extreme is to build modular systems that permit periodic inspection, maintenance, and repair. Telepresence would prevent crises before they could arise."

Applying this same reasoning to the space program could keep costs in check while maintaining a high standard of operational capability during missions. As for humans, integrated cranial telepresence could restore "mission capability" to damaged limbs. That does not mean the technology isn't still a little creepy in its formative stages, particularly if one wants to be "emotionally" telepresent.

TELL ME YOUR SECRETS:  the Telenoid wants to talk with you.  Image courtesy Ars Electronica.

The Telenoid, a telepresently-operated robot intended for advanced video conferencing, is able to mimic the eye, mouth, and upper body movements of its user, simulating the major tenets of what humans perceive physically as "emotions." Created by Japanese robotics engineer Dr. Hiroshi Ishiguro, it is an interesting attempt at sharing your feelings with faraway friends. The Telenoid's pale, spectral presence is still a bit too eerie to be considered a good substitute for human interaction, though future achievements in android avatar technology may allow for more realistic robotic experiences. While the emotional components of telepresence may still fall short, the physical elements of the technology have now proven to produce results, and disabled people like Ian Burkhart can hopefully use it to at least physically improve their lives.

Telepresence is undoubtedly a fine facet of the future now, and as we continue to map the human brain and unlock its secrets, perhaps externally beaming our thoughts out to our limbs (or those of robots under our command) will surpass many of humanity's previously-known physical limits. Though it seems nearly like movie magic at the present, future developments will branch out abundantly thanks to these current experiments. As Robert Heinlein said when first theorizing about telepresence in his story "Waldo & Magic, Inc.", "Never worry about theory as long as the machinery does what it's supposed to do."

Telepresent Demolition Derby on the moon soon?

After The Automatons: Could A Robot Take Your Job Soon?

With 47 percent of the world's jobs poised to become automated in the next twenty years, what is half of humanity going to do when it is retired by robots?

While creative endeavors and skilled jobs still maintain their value for labor, automated jobs are quickly being phased out by those with the means to reap more capital by building machines to do the work. As has been reported, "last year Google, Apple, Amazon, and Facebook were worth over $1 trillion combined, but employed just 150,000 people." With labor jobs dwindling and information jobs not escalating, what will workers do when their careers and cash all vanish thanks to the rich and their robots?

According to the Oxfam report "Working For The Few", "those richest 85 people across the globe share a combined wealth of £1 [trillion], as much as the poorest 3.5 billion of the world's population." With 85 people controlling the same amount of money as 3.5 billion, it is no surprise that ideas like wealth redistribution and possibly guaranteed minimum income may become serious social issues in the coming years.

How safe is your livelihood in the robot revolution?

Online Learning: An Intensive Bachelor's Level Computer Science Program Curriculum, Part II (Updated - Dec 2020)

Last month, we published a piece providing a basic template for a bachelor’s level computer science curriculum composed entirely from college or university courses that are freely available online. To date, this has been the most popular post on the blog, and we received a ton of great feedback, both positive and negative, in the comments and from around the web.

The original post was based on a learning plan that I had worked out for myself after I jumped into the study of programming and computer science just over a year ago on something of a whim. As I’ve mentioned before, I do not have any formal background in computer science beyond the handful of courses from this list that I have worked through myself. However, I do have years of experience in teaching and in curriculum design for natural and foreign language acquisition at the college level, and consulted the computer science curricula from a number of universities around the country when putting the plan together.

The idea was not to provide a substitute for an actual college or university education (that would typically also require a large amount of alcohol at the very least, which, unfortunately, is not freely available online), but rather to aggregate resources that have been made freely available online from disparate institutions and organize them into the sort of logical structure one would likely find in a general bachelor’s level computer science program.

On the basis of the feedback from that post, we’ve put together a new list of course offerings that covers a lot more ground. In the process, I’ve also loosened up a number of implicit strictures on resources for inclusion in the present listing. For example, some of these courses require registration at a particular website and/or may not yet be available in full (ex. Coursera), a couple others are actually compiled from other resources freely available online (ex. Saylor). But all of them are still free.

Whereas the first post was intended to provide a general overview of the field along with a generic curriculum and necessary resources suitable for an absolute beginner (containing 27 courses altogether), the present listing is much more extensive and intensive in scope, representing 72 courses from 30 different institutions. While we have added a number of new introductory level courses, there is a lot more that may be of interest to intermediate level folks and perhaps even some who are highly advanced and are considering a refresher course or two.

The course listing is broken down into three major divisions: Introductory Courses, Core Courses and Intermediate/Advanced Courses.  Individual courses are then listed by category within each division. 

Last but not least, thanks to everyone who provided feedback and offered suggestions on how to improve the original listing. Special thanks to Pablo Torre who provided a ton of links in the comments to the first post, many of which are included here. 

Introductory Courses 

Intro to Computer Science:
Theory of Computation:
Data Structures and Algorithms:

Core Courses 

Algorithms and Data Structures:
Operating Systems:
Computer Programming:
Software Engineering:
Computer Architecture:
Data Management:
Networking and Data Communications:
Cryptography and Security:
Artificial Intelligence:

Intermediate and Advanced Courses

Algorithms and Data Structures:
Software Engineering:
Mobile App Development:
Web Development:
Databases and Data Management:
Artificial Intelligence and Machine Learning:
Natural Language Processing:
Digital Media:
Networking and Communications:
Statistics and Probability:
Leave any suggestions for improvements or additions in the comments!

Block It Like It's Hot: Tetris Still Entertains At 30

The iconic brick-arranging, brainteasing video game classic Tetris turned 30 this week, yet remains a staple for novice to advanced gaming enthusiasts worldwide. First created by Russian engineer Alexey Pajitnov and eventually sold to Nintendo after a messy international battle over the game's rights (Pajitnov, a student at the time of the game's creation, would not see royalties for another 10 years due to his work technically being property of glorious Mother Russia), the beloved game has crossed oceans, language barriers, and gaming interfaces for over a generation.

Pajitnov's game had fascinating societal implications during the dawn of the personal computing age. As he would later tell the Guardian, "Tetris came along early and had a very important role in breaking down ordinary people's inhibitions in front of computers, which were scary objects to non-professionals used to pen and paper. But the fact that something so simple and beautiful could appear on screen destroyed that barrier."

Over fifty spin-offs of the Tetris empire exist, ranging from the sequel (Tetris 2) to Pajitnov's other endeavors (Hatris...a version with hats!) to the more esoteric (Tetripz). The game's addictive nature has been explained by some psychologists as offering the endlessly satisfying ability to complete small tasks in a neat manner.

Alexey Pajitnov's original Tetris design.

Computer Program Passes Turing Test; Judged As Plausible 13-Year-Old Boy

Like a technological Pinocchio, a computer program called "Eugene Goostman" has convinced researchers that "he" is a real boy.

Ostensibly a 13-year-old boy from Ukraine, the program was able to pass the Turing Test, which holds that a computer could be considered to be "thinking" if it could fool 30 percent of researchers during a five-minute text conversation. First proposed in 1950 by computer pioneer Alan Turing, the test is considered the preeminent benchmark for computational philosophy and artificial intelligence. The Russian-made program, tested by the Royal Society in England, fooled 33 percent of its interrogators.

As reported by the Independent UK, Mr. Goostman stated of his success, “I feel about beating the turing test in quite convenient way. Nothing original."

The success of the test brings about many questions, including many regarding the safety of computer users when dealing with possible cybercriminals. Fortunately the Goostman program has not evolved to a stage of teenage mischief...yet.

UPDATE:  The claims in this article have since been debunked.  My apologies, that's what I get for trusting the corporate media and their wannabe-robot minds.

Online Learning: A Bachelor's Level Computer Science Program Curriculum (Updated - Dec 2020)

[Update: See also the follow-up post to this piece, An Intensive Bachelor's Level Computer Science Curriculum Program.]

A few months back we took an in-depth look at MIT’s free online Introduction to Computer Science course, and laid out a self-study time table to complete the class within four months, along with a companion post providing learning benchmarks to chart your progress. In the present article, I'll step back and take a much broader look at comp-sci course offerings available for free on the internet, in order to answer a deceptively straightforward question: is it possible to complete the equivalent of a college bachelor’s degree in computer science through college and university courses that are freely available online? And if so, how does one do so?

The former question is more difficult to answer than it may at first appear. There are, of course, tons of resources relating to computer science and engineering, computer programming, software engineering, etc. that can easily be found online with a few simple searches. However, despite this fact, it is very unlikely that you would find a free, basic computer science curriculum offered in one complete package from any given academic source. The reason for this is fairly obvious. Why pay $50,000 a year to go to Harvard, for example, if you could take all the exact same courses online for free?

Yet, this does not mean that all the necessary elements for such a curriculum are not freely accessible. Indeed, today there are undoubtedly more such resources available at the click of a button than any person could get through even in an entire lifetime of study.  The problem is that organizing a series of random lecture courses you find on the internet into a coherent curriculum is actually rather difficult, especially when those courses are offered by different institutions for different reasons and for considerably different programs of study, and so on. Indeed, colleges themselves require massive advisory bureaucracies to help students navigate their way through complicated degree requirements, even though those programs already form a coherent curriculum and course of study. But, still, it’s not impossible to do it yourself, with a little bit of help perhaps.

The present article will therefore attempt to sketch out a generic bachelor’s level curriculum in computer science on the basis of program requirements distilled from a number of different computer science departments at top universities from around the country.  I will then provide links to a set of specific college and university courses that are freely available online which, if taken together, would satisfy the requirements of our generic computer science curriculum.

A Hypothetical Curriculum
So, what are the requirements of our hypothetical computer science program?  Despite overarching similarities, there are actually many differences between courses of study offered at different colleges and universities, especially in computer science.  Some programs are more geared toward electrical engineering and robotics, others toward software development and programming, or toward computer architecture and hardware design, or mathematics and cryptography, or networking and applications, and on and on.  Our curriculum will attempt to integrate courses that would be common to all such programs, while also providing a selection of electives that could function as an introduction to those various concentrations. 

There are essentially four major parts to any bachelor’s level course of study, in any given field: pre-requisites, core requirements, concentration requirements and electives. 

Pre-requisites are what you need to know before you even begin. For many courses of study, there are no pre-requisites, and no specialized prior knowledge is required or presumed on the part of the student, since the introductory core requirements themselves provide students with the requisite knowledge and skills. 

Core requirements are courses that anyone in a given field is required to take, no matter what their specialization or specific areas of interest within the field may be.  These sorts of classes provide a general base-level knowledge of the field that can then be built upon in the study of more advanced and specialized topics.

Concentration requirements are classes that are required as part of a given concentration, focus or specialization within an overall curriculum.  For example, all students who major in computer science at a given university may be required to take two general introductory courses in the field, but students who decide to concentrate on cryptography may be required to take more math classes, while students interested in electrical engineering may take required courses on robotics, while others interested in software development may be required to study programming methodologies and so on.

Finally, electives are courses within the overall curriculum that individuals may decide to take at will, in accordance with their own particular interests.  Some people may prefer to take electives which reinforce sub-fields related to their concentration, while others may elect to sign on for courses that may only be tangentially related to their concentration.

Our hypothetical curriculum will simplify this model. We will assume no prerequisites are necessary other than an interest in learning the material and a basic high school education.  Our curriculum will also not offer any concentration tracks in the traditional sense, as that would require specialized resources that are not within the scope of our current domain.  Instead, our planned curriculum shall provide for introductory courses, general core requirements, and a choice of electives that may also serve as a basis for further concentration studies.

Basic Requirements
A quick survey of curricular requirements for programs in computer science at a number of the country’s top colleges and universities reveals a wide spectrum of possibilities for our proposed curriculum, from a ten-course minor in computer science to a twenty-five-course intensive major in the field along with an interdisciplinary concentration. (See, for example, MIT, Carnegie Mellon, Berkeley, Stanford and Columbia, or the comp-sci page for a college or university near you.)

Our proposed curriculum will attempt to stake out a space between those two poles, and aim for a program that consists of about 15 courses: 3 introductory classes, 7 core classes and 5 electives. The required topics and themes of a generic computer science degree program are fairly easy to distill from the comparison: introduction to the field, data structures, algorithms, programming languages, operating systems, networking, data communications, systems engineering, software development, and so on.  Our program will consist of university or college level courses from around the world that cover our basic requirements and are freely available in full online.
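For the sake of concreteness, the proposed breakdown can be sketched as a simple Python data structure.  The slot counts come from the plan above; the topic lists are illustrative, drawn from the themes just mentioned, not a fixed catalog:

```python
# A rough sketch of the proposed 15-course program.  The slot counts
# per category are the only firm numbers; the topics are examples.
curriculum = {
    "introductory": {
        "slots": 3,
        "topics": ["intro to computer science", "basic mathematics"],
    },
    "core": {
        "slots": 7,
        "topics": ["data structures", "algorithms", "operating systems",
                   "programming languages", "computer architecture",
                   "data communications", "cryptography and security"],
    },
    "electives": {
        "slots": 5,
        "topics": ["web development", "app development",
                   "artificial intelligence"],
    },
}

total_slots = sum(c["slots"] for c in curriculum.values())
print(total_slots)  # 15 courses in the full program
```

Filling each slot then becomes a matter of matching a freely available online course to one of the topics in its category.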

Note: I have, unfortunately, not watched every single video from all of the courses below.  However, I have completed three of them in full, viewed a handful of lectures from a number of the other courses, and spot-checked the videos from the rest for quality.

Introductory Courses 

Intro to Computer Science, pick two of three:
Basic mathematics, pick one of two:

Core Courses 

Data Structures and Algorithms, pick one of two:
Operating Systems:
Programming Languages and Methodologies:
Computer Architecture:
Data Communications:
Cryptography and Security:


Electives

Web Development:
Data Structures:
Programming Languages:
App Development:
Artificial Intelligence:
Leave any suggestions for improvements or additions in the comments!

UPDATE: There has been a ton of great feedback on this post, with suggestions for additions, critiques of the overall form, identification of "glaring holes" and more.  Thanks everyone!  However, rather than address them one by one in the comments, or fold them all into an update of some sort, I think I may just begin work on a new version of the piece that provides a more intensive track of study and incorporates as many of those suggestions as possible, assuming examples of such courses are available for free in full online from a college or university.  So be sure to check back in the future!

UPDATE II:  See also the companion post to this piece, An Intensive Bachelor's Level Computer Science Curriculum Program.

Quantum Leap in Quantum Computing?

From Popular Mechanics:
It's a machine that could calculate solutions to problems so impossibly time-consuming that even the most powerful supercomputers could never handle them. And it would do so in an instant. This is the quantum computer, made possible by the bizarre nature of quantum mechanics. And though the idea is still in its infancy, it's no fantasy.

Two research teams, at Harvard University and the Max Planck Institute of Quantum Optics in Germany, have just announced that they have independently forged the building blocks for tomorrow's quantum computers. As they published today in the journal Nature (1, 2), the scientists discovered a way to hook up atoms and particles of light to create a new type of switch and logic gate: quantum versions of the connecting structures that link bits of data in modern computers.
Amusingly, the author of the article has apparently fielded some questions in the reddit post on the story and provided a rather tight summary:
New computing equipment allows info to be put into a fucking weird quantum state, which can do crazy shit. Like super fast computing. We've made similar things before, but can't build big computers with them. With this we think we can.
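To make the "logic gate" analogy a bit more concrete, here is a minimal sketch, using NumPy, of what quantum gates do to a state vector.  This simulates the standard Hadamard and CNOT gates on an ordinary classical machine; it has nothing to do with the atom-and-photon hardware in the Nature papers, and is only meant to illustrate how quantum gates transform superpositions:

```python
import numpy as np

# Single-qubit Hadamard gate: puts a qubit into an equal superposition.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

# Two-qubit CNOT gate: flips the second qubit when the first is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start both qubits in the |00> basis state.
state = np.array([1, 0, 0, 0], dtype=float)

# Apply H to the first qubit (identity on the second), then entangle
# the pair with CNOT.
state = CNOT @ np.kron(H, np.eye(2)) @ state

print(state)  # amplitudes of |00>, |01>, |10>, |11>
```

Starting from |00>, this sequence leaves the pair in an entangled Bell state: equal amplitudes on |00> and |11>, and zero on the other two basis states — the kind of "fucking weird quantum state" the summary above is gesturing at.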
See also this handy infographic:

Online Learning: Three Free Introduction to Computer Science Courses

These days, with a bit of perseverance and discipline, it is entirely possible to receive a world-class education in computer science for free online from the comfort of your own home.  Many of the top computer science departments at US universities make their course lectures and materials freely available on the net, providing motivated individuals with a range of choices that is almost unbelievable in its scope.  In this post, we'll take a look at three Introduction to Computer Science courses that have been made freely available online by Harvard, MIT and Stanford.  The Harvard course provides an introduction to C, PHP and JavaScript.  Stanford focuses on Java.  And MIT utilizes the Python programming language.

Harvard's Intensive Introduction to Computer Science
Course site and description:
This free online computer science course is an introduction to the intellectual enterprises of computer science. Topics include algorithms (their design, implementation, and analysis); software development (abstraction, encapsulation, data structures, debugging, and testing); architecture of computers (low-level data representation and instruction processing); computer systems (programming languages, compilers, operating systems, and databases); and computers in the real world (networks, websites, security, forensics, and cryptography). The course teaches students how to think more carefully and how to solve problems more effectively. Problem sets involve extensive programming in C as well as PHP and JavaScript.
Stanford's Introduction to Computer Science and Programming Methodology
Course site and description:
This course is the largest of the introductory programming courses and is one of the largest courses at Stanford. Topics focus on the introduction to the engineering of computer applications emphasizing modern software engineering principles: object-oriented design, decomposition, encapsulation, abstraction, and testing. 
Programming Methodology teaches the widely-used Java programming language along with good software engineering principles. Emphasis is on good programming style and the built-in facilities of the Java language. The course is explicitly designed to appeal to humanists and social scientists as well as hard-core techies. In fact, most Programming Methodology graduates end up majoring outside of the School of Engineering. 
MIT's Introduction to Computer Science and Programming
Course site and description:
This subject is aimed at students with little or no programming experience. It aims to provide students with an understanding of the role computation can play in solving problems. It also aims to help students, regardless of their major, to feel justifiably confident of their ability to write small programs that allow them to accomplish useful goals. The class will use the Python programming language.  Many of the problem sets focus on specific topics, such as virus population dynamics, word games, optimizing routes, or simulating the movement of a Roomba.