Coursera Expands to 16 Universities, 3 Int'l.
From the Coursera Blog:
"We are THRILLED to announce that 12 universities—including three international institutions—will be joining Princeton University, Stanford University, University of Michigan, and University of Pennsylvania in offering classes on Coursera.
On Coursera, you will now be able to access world-class courses from:
· California Institute of Technology
· Duke University
· École Polytechnique Fédérale de Lausanne
· Georgia Institute of Technology
· Johns Hopkins University
· Princeton University
· Rice University
· Stanford University
· University of California, San Francisco
· University of Edinburgh
· University of Illinois at Urbana-Champaign
· University of Michigan
· University of Pennsylvania
· University of Toronto
· University of Virginia
· University of Washington
You’ll be able to choose from more than 100 courses, from Professor Dan Ariely’s course on irrational behavior, to learning how to program in Scala (taught by the creator of Scala, Professor Martin Odersky of EPFL), to the legendary UVA course “How Things Work” with Professor Louis Bloomfield. You can check out the most current course list here — keep in mind you can enroll in a class even if the start date is TBA.
To date, 700,000 students from 190 countries have participated in classes on Coursera, with more than 1.55 million course enrollments total!"
21 Comments:
I took a Coursera course from Princeton on Game Theory. It was a blast.
People shouldn't view it as a replacement for a rigorous education, because a little bit of knowledge is a dangerous thing.
Why not put it on TV too?
Any online learning that allies itself with existing institutions is a waste of time, because existing colleges are largely a waste of time. For example, I was curious about this course on VLSI chip design, only to find out it gets into the nitty-gritty of how modern VLSI CAD software is designed, not VLSI chip design itself. Building such CAD software is such an esoteric, specialized subject that it is likely to interest almost nobody, but it is easy for the professor to toss up there because he can just regurgitate his own research in front of the class. There might be a small cohort of chip designers who would be interested, more for continuing education, but those people would pay good money for a quality course and are unlikely to waste their time on a free course like this. The problem with current education is the crap that's taught and the mediocrities who teach it. Simply slapping the same shit online doesn't fix that fundamental problem; it just wastes time until online learning really takes off.
Sprewell, if you don't use what you learned and build upon that knowledge or expertise, it may seem school was a waste of time.
For example, my minor was Accounting and I didn't use it much. So it seems it may have been a waste of time.
Nice that these courses are available, but what matters more than what you know is what people THINK that you know... especially when trying to land a particular job.
If you just want to learn about something, there's always a book out there to learn from.
Arbitrage, what you actually know is important, if you want to keep your job or get promoted.
Reading a book is easy. Going through a rigorous program isn't.
Peak Trader (6:01)
My comment applies primarily to the issue of getting a job from people who don't know you... not so much about getting a promotion from those who already know and trust you.
IOW, learning something, and subsequently receiving a degree or certificate that provides proof of learning, MAY have value in the job marketplace.
Knowledge itself, no matter how much, is of little value in getting hired initially.
Peak, the question isn't whether you got a job in your major, the question is whether what is taught has any utility whatsoever. For example, all engineers in my discipline were forced to take a theoretical class that underlay the subject, but that nobody would ever use because nobody worked at that low a level anymore. Further, you would gain essentially no insight from such a math-focused theoretical class, at least nothing that couldn't have been gained by a couple of weeks of more superficial overview. But because this theory underlay the discipline and had always been taught, it was still taught and everybody had to take it. People would put it off till their last semester, because essentially no other class required that knowledge of them, but they had to take it.
So much of what passes for a college education is such filler: subjects that are utterly useless for the vast majority who take them but are still taught, especially if they had some utility at one point, decades ago. People wonder why there haven't been as many big innovations lately; this wasted time in college no doubt plays a role in that.
As for your reply to arbitrage, he actually hit the nail on the head. College is just a certification program, which is proven when you get out into the workforce, where the content of most degrees is essentially ignored and forgotten. What matters is having that checkmark for a bachelor's or some advanced degree on your resume; essentially nobody looks at the content of the degree. And considering that most people choose to go through your favored college programs rather than read the book on their own, that suggests the college programs are easier, which is what most people who go through them will tell you, if you ask them. They'll usually say they don't have the discipline to read the book on their own, which admittedly is a different reason than the content itself. But if you're using the same book that the class would use, the material is going to be equally useless no matter how you learn it.
"Why not put it on TV too?"...
At one time, back in the late fifties and early sixties, some California college did have actual classes (60-minute format, three times a week, very early in the morning) in bookkeeping, accounting, and something called Business English...
Some foundation produced these programs and I saw them run in California, Illinois, and Texas...
They were broadcast overseas too...
Sprewell says: "forced to take a theoretical class that underlied the subject, but that nobody would ever use because nobody worked at that low a level anymore... People would put it off till their last semester."
Perhaps that should've been the first class taken, because it's rich in methodology that helps you in subsequent classes.
Also, it's superficial to believe it's a piece of paper that's important rather than actual knowledge gained from completion of a rigorous program.
My dad hired other civil engineers. He didn't care about their education and experience, only what they knew and could do today.
Sprewell says: "People wonder why there haven't been as many big innovations lately."
I guess you missed the Information Revolution, and may miss the Biotech Revolution (i.e. the third and fourth major economic revolutions in human history).
Juandos, perhaps educational shows can replace much of what's on TV now.
Of course, the potato chip and drug crowd may not like it.
Peak, slapping my head because I already told you that the math-heavy theory class was so useless that many would put it off till their last semester. Yes, it's superficial to just demand a piece of paper, yet that is precisely what most workplaces do. Actually, they are so lazy that they don't even demand the paper to check what's on the resume, so many applicants just lie about their degree, with essentially no consequence. On occasion, they're finally caught years later if they happen to get a high-profile job, like George O'Leary when he got the Notre Dame coaching job or Scott Thompson when he was recently hired as the CEO of Yahoo. Such resume padding is widespread, signifying that the "signal" from the degree itself is essentially worthless. Obviously "actual knowledge" will take you much farther in life, the problem is that most college programs have very little of that and in any case, most workplaces don't know how to check for knowledge. If your dad didn't care about education and experience, then you are making my point that the degree is irrelevant and someone can just learn on their own, likely through online learning. :)
As for the Information Revolution, that started twenty years ago; I was talking about the last 10-15 years. Essentially all improvements since have been incremental, like shrinking computers down small enough to be held in the palm of your hand, as mobile devices, or improving the range of wireless networks so you can get the same internet everywhere you go. You could call this last decade the decade of the mobile revolution, and that's likely what it will be remembered for in the future, but it didn't take much breakthrough innovation to do it. As for Biotech, don't make me laugh: if that revolution had started, people wouldn't be wondering where it is.
"Juandos, perhaps, educational shows can replace much of what's on TV now"...
You know pt I would like to think that in some small degree it has...
Obviously not to the point of a degreed program, but I would hope that programs running on the History Channel, National Geographic, Discovery Channel, Military Channel, Nova, Nature, and Frontline might inspire younger viewers to reach for more, and more useful, education later on in life...
I know back in the sixties when I was in high school Richard Feynman inspired a lot of people in my age group to go to college and study some of the hard sciences...
It was due to Feynman's television broadcasts all through the middle sixties...
Sprewell, I guess you know better than employers and instructors.
It must be those online classes.
The U.S. Biotechnology Industry
"The United States is the largest market and leading consumer of biotechnology products in the world and home to more than 1,300 firms involved in the industry. The U.S. employs more than 60 percent of the worldwide workforce in dedicated biotechnology firms and is home to 70 percent of the world’s research and development, according to a survey by Ernst and Young. In 2008, bioscience research and development in the U.S. was valued at $32 billion (2010).
Peak, yes, yes, I do know better than instructors, much better, but employers? They're basically in my camp if they never actually check whether the person got the degree. But yeah, many will stupidly throw out resumes if there's no degree listed, so they're not too bright either. So that's your evidence of a biotech revolution, 2008 R&D of $32 billion? Ha ha ha ha, what does that say for your "rigorous college program," when that is the ridiculous evidence you offer? That's less than the revenues of Google last year, one info-tech company. I suggest you actually look at the links I gave you, where it is well known that biotech has been a failure for decades now.
The article continues...
"There are more than 5.5 million scientists, engineers and technicians in the United States, 1.3 million people directly involved in biosciences, and another 5.8 million workers in related industry sectors."
Top 50 Technology R&D Spenders
"Indeed, the correlation between R&D spending and innovation isn't clear. In terms of proportional research spending, Apple ranked last on our list of top R&D spenders, with a 3.2% research and development outlay ($844 million altogether).
Yet nobody would accuse Apple, creator of the iPod and iPhone, of not being innovative, or of not being able to transform its successes into bottom-line results. Apple's profit has grown an average of 62% over its last two fiscal years."
"Microsoft, IBM and Intel accounting for 38% of all research outlays."
Peak, I'm not sure what case you're trying to build with these isolated, unsourced quotes, which come with no context about what you intend to communicate. But I will point out that the fact that so much money is still being spent on biosciences R&D perfectly exemplifies my point that the field hasn't taken off yet, which is why they're still doing research, as opposed to a company like Apple, which is making fat revenues and doesn't have to do as much R&D anymore. That's the point: info-tech has taken off and has fat revenues to show for it, while biotech has comparatively little revenue. That's why you keep changing the subject to $32 billion in total biotech R&D spending, which is still much smaller than the $125 billion that one info-tech company, Apple, brought in as revenue last year. The info-tech revolution has already taken off; biotech hasn't.
Of course, biotech is not at the Nasdaq bubble stage yet.
Graduating more microbiologists, biochemists, and other bioscientists will get us there faster.