Monsters University reflects what is really happening.

Image: Monsters University, by RJ Bailey

Monsters University: the Aftermath
[Via Crooked Timber]

Monsters University, the prequel to Monsters, Inc, opened this weekend. I brought the kids to see it. As a faculty member at what is generally thought of as America’s most monstrous university, I was naturally interested in seeing how higher education worked in Monstropolis. What sort of pedagogical techniques are in vogue there? Is the flipped classroom all the rage? What’s the structure of the curriculum? These are natural questions to ask of a children’s movie about imaginary creatures. Do I have to say there will be spoilers? Of course there will be spoilers. (But really, if you are the sort of person who would be genuinely upset by having someone reveal a few plot points in Monsters University, I am not sure I have any sympathy for you at all.) As it turned out, while my initial reactions focused on aspects of everyday campus life at MU, my considered reaction is that, as an institution, Monsters University is doomed.


Read the entire post first.

While written in a satirical style, the post holds some real insights. Higher education is being tremendously disrupted and is dealing with that disruption poorly.

Here is the key point that the movie demonstrates (spoilers).

Our heroes are expelled from Monsters University – whose reason for existence is to train and certify employees who can produce the power needed to run their society by inducing fear in children.

They then get jobs at Monsters, Inc. and rise to become the foremost employees at the company, even though they do not hold the proper certifications. Then, in an act of total disruption, they discover a new source of almost unlimited power: laughter.

Here is how the post puts it:

The consequences for Monsters University are obvious, and chilling. Two expelled former students have gone on not only to rise to a level of occupational success that ought to be impossible without an MU credential, but have discovered new fundamental facts about the world that completely undermine the knowledge base of Monsters University as an institution. It’s as if Jobs and Wozniak were also Fleischmann and Pons. The School of Scaring, which we hear early on is the “crown jewel” of MU, is now completely outmoded and also, surely, entirely delegitimated.

Why should this institution of higher learning exist if its training is no longer relevant? It has spent all its energies on a School of Scaring. How will it deal with the new, disruptive demand for Laughter Learning?

This is a simpler version of what is really happening. Massive Open Online Courses are opening up learning to anyone, producing education with an entirely different system of credentials than a simple BA.

We are seeing high schoolers producing innovations that already interest large companies, which may be willing to hire that creativity directly, without the need for college.

The latest winner of the Intel Science Talent Search is developing a new source of energy: algae that produce oil. She did all the work in a home-built lab. Another student built a cheap pulsed plasma device, bringing this technology to the masses.

And it is not only America. At the recent Intel International Science and Engineering Fair, a student from Romania figured out a way to accomplish for $4,000 what cost Google over $70,000.

Any of these kids could simply work for a lot of money, or even attract capital to start their own companies. All without needing a certificate from a university.

Adaptive Lighting Could Slash Electricity Bills

Adaptive Lighting Could Slash Electricity Bills
[Via Discovery News - Top Stories]

A business card-size device monitors available light and makes adjustments as needed.

According to the article, a new development from MIT could slash a home’s energy budget in half. Researchers Matthew Aldrich and Nan Zhao built a system that’s able to monitor available light and adjust it automatically. The setup was made using LEDs, the most efficient form of lighting that is commercially available. Unlike compact fluorescent bulbs, LEDs can be adjusted to any level of lighting intensity.

LED lighting lasts for perhaps 50,000 hours and can be dimmed, something compact fluorescents cannot do. Add in a smart sensor, and the lighting can be distributed much more efficiently.

This could have a huge effect on energy usage, particularly in commercial settings. According to the original research, they can also use multiple LEDs of different colors to modify the color balance depending on the need.

The researchers found massive reductions in light usage, up to 90 percent. Since LEDs already have one of the best light-output-per-watt ratios available, the overall reduction could be enormous.
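As a rough back-of-the-envelope sketch of why sensor-driven dimming matters, here is the savings arithmetic. Every number below is an illustrative assumption, not a figure from the MIT study:

```python
# Rough illustration of adaptive-lighting savings.
# All fixture counts, wattages, and dimming levels are made-up assumptions.

def annual_kwh(watts, hours_per_day, days=365):
    """Energy use in kilowatt-hours per year."""
    return watts * hours_per_day * days / 1000

# Baseline: ten 12 W LED fixtures running at full brightness, 8 h/day.
baseline = annual_kwh(watts=10 * 12, hours_per_day=8)

# Adaptive: a light sensor dims the fixtures to an average of 40% output
# whenever daylight can cover the rest.
adaptive = annual_kwh(watts=10 * 12 * 0.40, hours_per_day=8)

savings = 1 - adaptive / baseline
print(f"baseline: {baseline:.0f} kWh/yr, adaptive: {adaptive:.0f} kWh/yr")
print(f"savings: {savings:.0%}")  # → savings: 60%
```

Even at a modest average dim level the savings track the dimming fraction directly, which is why the reported reductions of up to 90 percent in light usage translate almost one-for-one into energy saved.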

And I loved this quote – my bold:

Jeffrey Cassis, the CEO of Philips Color Kinetics, a leading manufacturer of LED lights, says that the team doing this work “is world-class — they are working on a hard problem and a quality, cost-effective solution has great potential.” He says that it is important to use a systems approach, as this team is doing, looking at the whole lighting system rather than just individual components. But he adds that the cost of the finished system, as well as how easily it can be used to retrofit existing lighting systems, will be crucial factors in determining its adoption.

A generational war

Image by kentbye
[Crossposted at SpreadingScience]
Social Media vs. Knowledge Management: A Generational War:
[Via Enterprise 2.0 Blog]

You’d think Knowledge Management (KM), that venerable IT-based social engineering discipline which came up with evocative phrases like “community of practice,” “expertise locater,” and “knowledge capture,” would be in the vanguard of the 2.0 revolution. You’d be wrong. Inside organizations and at industry fora today, every other conversation around social media (SM) and Enterprise 2.0 seems to turn into a thinly-veiled skirmish within an industry-wide KM-SM shadow war. I suppose I must be a little dense, because it took not one, not two, but three separate incidents before I realized there was a war on. Here’s what’s going on: KM and SM look very similar on the surface, but are actually radically different at multiple levels, both cultural and technical, and are locked in an undeclared cultural war for the soul of Enterprise 2.0. And the most hilarious part is that most of the combatants don’t even realize they are in a war. They think they are loosely-aligned and working towards the same ends, with some minor differences of emphasis. So let me tell you about this war and how it is shaping up. Hint: I have credible neutral “war correspondent” status because I was born in 1974.


A very clear post that describes the conflict between Boomer and Millennial thinking when it comes to dealing with large amounts of data. Knowledge management (Boomer) is a top-down, put-the-data-in-the-proper-bin sort of approach. There are names for each bin, and everything needs to fit in the correct one.

Social media (Millennial) uses human social networks in a bottom-up approach that allows the data to determine where it should go. Any bin that it should go into is an emergent property of the network created by the community.

Read the whole post for a nice dissection of what is happening in this War. Just remember that Age is not as important as attitude. There are Boomers who get social media and Millennials who do not.

I think it is that one personality wants things to be black and white (the data is in a database on THIS computer) while the other deals well with shades of gray (the data is in the cloud and not really anyplace).

I did my post-doc in a chemistry lab, as the only biologist. I saw something very valuable: chemistry is very process-driven. The purpose of a process is to reproduce success. If a process, say a particular chemical synthesis, did not work – the yield was 10% instead of 90% – it was not the fault of the process. The reagents were bad or the investigator was incompetent. But the process was still valid.

So chemistry selected for people who were very process-driven, wanted things very tightly controlled and well defined.

Biology has a very different regard for process. The same process (say the cloning of a gene) can be done on two different days and get different results (10 colonies of cells one day; 500 the next). Biology is really too complex to be able to control everything. A lot of things can go wrong and it can be really easy to fool oneself with results.

So biology, particularly at the cutting edge, selects for people who can filter out extraneous bits of data, can be comfortable with conditional results and with the general anarchy that can occur. Every molecular biologist has experienced the dreaded ‘everything stops working, so I have to remake every buffer, order new reagents and spend a month trying to figure out what happened, knowing that things will start working again for no real reason.’

Chemists in my post-doc lab hated biology because of the large variance in results compared to chemistry. Biologists are often happy to be within an order of magnitude of expected results.

One way of thinking has to know whether Schrödinger’s cat is dead or alive, while the other is comfortable knowing it is simultaneously dead and alive.

Biology needs the Millennial approach because it is creating data at too fast a pace to put it all into bins. Social networks can help tremendously with the filters needed to find knowledge in the huge amount of data.


Mon, 02 Jun 2003 06:08:46 GMT

RSS’ Growing Importance.

The RSS feed is growing in importance. Two recent comments indicate the attention being paid to the fact that RSS may become the new way to disseminate information.

Writes Jon Udell:

Direct one-click access to RSS sources is suddenly a lot more interesting. It used to be that RSS aggregators were few. Now they are many — because every copy of Radio is one. The people running these aggregators can now start to trade channels as we used to trade links.

The benefits of this new RSS fluidity, which kicks things up a level of abstraction, seem obvious to me, and will seem obvious to anyone who finds their way here to read this. But those benefits will not be obvious to most people. Casual use of ordinary links is still not nearly as prevalent in routine business and personal communication as it ought to be. The kind of meta-linking possible with channel exchange will seem even more exotic. The challenge — and opportunity — is to make all this as easy and natural as most people think email is.

Adds Tim Bray:

Eventually there will be business models built around weblogs, with more popular ones being more lucrative. And while the Pagerank-style ratings produced by Technorati, Daypop and so on are important, the big question is going to become: “how many subscribers do you have?”

[E M E R G I C . o r g]

RSS, aggregators and weblogs have benefits that are not immediately obvious. As in all paradigm shifts, if you are on one side of the paradigm, you simply cannot understand what the person on the other side is talking about. People forget that email was not intuitively obvious to most when it first began to be used. I watched it take two years at Immunex before you could be certain that another scientist would read your email at least THAT day, so that you would not have to walk down the hallway to ask if they had read it. We forget just how long it can take for these sorts of empowering technologies to filter through even the most creative and innovative communities. RSS and blogs are even more non-obvious to many. But the power they provide, their ability to increase information flow, will make them extremely useful to those who use them.

Fri, 30 May 2003 21:21:18 GMT

A Little Less Conversation. The whole social software craze is already getting quite a lot of backlash. This article from the BBC reiterates points that many social software critics have, which is that the people working on social software think they’ve discovered something new, when it’s really just the latest version of work that’s been going on for ages. Worse, those designing and promoting social software are accused of ignoring all the lessons people learned before them about human-computer interaction and how computers really play into social interaction. I tend to agree that many people, when they start working on hyped trends, often ignore very important work that’s been done before – but that doesn’t mean the new work isn’t useful as well. Both sides seem to be taking an antagonistic viewpoint on this debate, which is becoming more and more a debate over the semantics of social software, rather than looking at how the software is actually being used. In the end, people will keep using what works, and won’t worry whether it’s officially “social software” or some other term.

What is different this time is that many of the tools being lumped into social software are actually easy for people to use. They allow someone to get up and going in a community without needing to understand just what is happening. Social software is different from knowledge management systems, although both try to solve similar problems. But KM software tends to be monolithic and top-down, requiring the user to learn an arcane and non-intuitive viewpoint in order to use it. Many of the tools from social software are simple enough to be easily manipulated. The user can find the best one to use, the one that fits their own viewpoint. This is a big, but subtle, difference.

Discussion On Blog Discussions

It is too late for me to write more about this but Tom is right on track. Blogs filter important information and disperse it rapidly in a fashion that has been impossible before. As we get better at using this approach, we will see a larger explosion of its filtering properties.

Sun, 25 May 2003 05:44:00 GMT

Bursty Community Formation in Blogspace. Absolutely fascinating paper on community formation in blogspace, by Ravi Kumar, Prabhakar Raghavan, Jasmine Novak, and Andrew Tomkins, called On the Bursty Evolution of Blogspace. (Free ACM account required — it’s so worth it, just for this article.)

The authors develop a method of measuring time-stamped link-space, so that blogspace can be mapped based not just on links, but links by date, allowing them to track the formation of communities, defined here as a dense cluster of weblogs all pointing back and forth to one another.

Using this method, they put some meat on the bones of what everyone knows:

Within a community of interacting bloggers, a
given topic may become the subject of intense debate for
a period of time, then fade away. These bursts of activity
are typified by heightened hyperlinking amongst the blogs
involved — within a time interval.

They then go on to identify several examples of communities coalescing in a brief period of time around a set of posts — WannaBeGirl’s blog poetry in 2000, or Dawn’s Funniest/Sexiest Blogger poll from 2002. (Unsurprisingly, both examples used posts about other people to get those people’s attention.)

They outline their method for crawling and analysing blogspace while looking for these burst-forming communities, and the algorithm looks like a useful feature for ongoing exploration of blogspace. (Paging David Sifry. David Sifry to the white courtesy telephone…) They also segment blogs by in-bound links:

…pages linked-to by an enormous number of other pages
are too well-known for the type of communities we seek to
discover; so, we summarily remove all pages that contain
more than a certain number of in-links.

in order to differentiate between community participation and publishing (an argument I’ve been groping towards in Communities, Audiences and Scale, and Weblogs, Power Laws and Inequality, but the algorithms here are far more precise than my descriptions).

Finally, they analyze the changes in their data set overall, and come to two remarkable conclusions: first, 2001 really was the unusual year, with the link structure at both a macro and micro level taking a remarkable jump in density.

Second, there is a core set of blogs that forms a Strongly Connected Cluster, and it is growing rapidly:

But up to this point, blogspace is not
a coherent entity — the overall size has grown but the interconnectedness
is not significant. At the start of 2001, the
largest component begins to grow in size relative to the rest
of the graph, and by the end of 2001 it contains about 3%
of all nodes. In 2002, however, a threshold behavior arises,
and the size of the component increases dramatically, to over
20% by the present day. This giant component still appears
to be expanding rapidly, doubling in size approximately every
three months. Clearly this growth cannot continue and
must plateau within two years.

Oh, and they prove that blogspace is not a random graph, and conclude that blogspace can better be analyzed as a set of inter-networking communities than as a set of stand-alone blogs.
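The two operations the paper describes – pruning pages that are too well-known, then measuring the giant strongly connected component – can be sketched in a few lines. This is a toy illustration using networkx on a made-up link graph, not the authors’ dataset or their exact algorithm:

```python
import networkx as nx

# Toy directed link graph; an edge points from a linking blog to a linked blog.
G = nx.DiGraph([
    ("a", "b"), ("b", "c"), ("c", "a"),      # a small mutual community
    ("d", "e"), ("e", "d"),                  # another
    ("a", "hub"), ("b", "hub"), ("c", "hub"),
    ("d", "hub"), ("e", "hub"),              # everyone links to a famous hub
])

# Step 1: remove pages with too many in-links -- they are "too well-known
# for the type of communities we seek to discover."
MAX_IN_LINKS = 3
pruned = G.copy()
pruned.remove_nodes_from(
    [n for n, deg in G.in_degree() if deg > MAX_IN_LINKS]
)

# Step 2: find the largest strongly connected component and its share of
# all remaining nodes (the paper tracks this fraction growing over time).
giant = max(nx.strongly_connected_components(pruned), key=len)
fraction = len(giant) / pruned.number_of_nodes()
print(sorted(giant), f"{fraction:.0%}")
```

With the hub pruned away, the mutually linking cluster {a, b, c} emerges as the giant component – exactly the kind of dense back-and-forth community the authors are hunting for, which the celebrity hub would otherwise have glued into one undifferentiated blob.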

It’s too early to tell for sure, but this paper feels absolutely seminal. I know it’s a pain to set up another online account, but do it anyway, and then go read the paper. (Thanks, Hylton) [Corante: Social Software]

I’m setting up my account. A single blog does not work. It is the community of blogs that is important. An idea gets started in a sort of amorphous way and then is intensified and remodeled as it moves through the community, until it reaches a tipping point and explodes. We see this again and again. It is what makes weblogs a different medium than any other. It may be why the power blogs are not that important. We shall see.

Thu, 22 May 2003 05:49:00 GMT

Finding Information in Blogosphere.

Tom Coates applies Duncan Watts’s Small Worlds ideas to blogs and states: “For any given body of information on weblogs, no matter the rate of replication of information or the number of people who post exactly the same comments, close to 100% of the available insight can be reviewed by reading a disproportionately small number of sites – sites that will – as a rule – be among the first that they stumble across through their normal browsing and research patterns.”

[E M E R G I C . o r g]

It seems that in a well-connected network, information will get around without every single weblog needing to be read. Redundancy of information and overlapping spheres of interest will make it work. I think what is more critical is the spread of memes and the development of a tipping point that causes a phase shift in viewpoint. An example may be the NYT archives thread in the blogosphere. This was first discussed over a month ago. Then we recently got a second wave of interest, with a wider dispersal and more forceful presentation of ideas and viewpoints. I expect the next one will be even stronger. The resonance just gets stronger and stronger, like Jimi Hendrix’s guitar feedback, until things shift. Weblogs are not passive, and the ideas they present are not either. It is the interactions of many weblogs that disperse information until knowledge is created. I think too many people view weblogging as a passive event: I put up my ideas and they kind of lie there. But if the ideas are worthwhile, they get picked up and examined by others, molded and stamped with their views, and then passed on. It is this dynamic process that many people miss. It is what makes weblogs so unique and so powerful.
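The small-worlds intuition behind Coates’s claim can be felt in a few lines of simulation. This is a generic Watts-Strogatz model built with networkx – the parameters are arbitrary assumptions for illustration, not data from Coates or Watts:

```python
import networkx as nx

# A small-world graph: 1000 blogs, each linked to its 6 nearest neighbors,
# with 10% of links rewired to random distant blogs. The graph is rebuilt
# until connected, so path lengths are always defined.
G = nx.connected_watts_strogatz_graph(n=1000, k=6, p=0.1, seed=42)

# A handful of random long-range links makes the whole network only a few
# hops across, so an idea posted anywhere is a short chain of
# "picked up and passed on" away from everywhere else.
avg_hops = nx.average_shortest_path_length(G)
print(f"average hops between any two blogs: {avg_hops:.1f}")
```

The average separation stays in the single digits even for a thousand nodes, which is why reading a disproportionately small number of well-placed sites captures nearly all of the circulating insight.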

Thu, 22 May 2003 04:40:53 GMT

Hey, Your Blog Is In My Wiki! No, Your Wiki Is In My Blog! A New Blog-wiki Service for Collaborative Content Management

“Got notified by e-mail today from a staff member at about their new free blog-wiki (hence, “bloki”) authoring-hosting service. You’ll find my home page at (not much there) and the blog part of my site at Individuals can collaborate on both the blog and the main site pages. The blog includes an RSS feed. The browser-based editor is derived from htmlarea, and features a Microsoft Word-like interface, very similar in fact to the editor used by the WebCrimson service. is powered by (and no, they didn’t pay me to promote their product).” [The Ten Thousand Year Blog]

[The Shifted Librarian]

A combination blog and wiki has some interesting possibilities. I’ll have to check this out.

Spinning around

A nice discussion of the effects of information overload. Many of the problems come from an inability to filter out noise. Better social tools should help with this. I do love one of the quotes because I absolutely believe it is true:

[Gerry] McGovern quotes author Frances Cairncross from her book The Company of the Future: ‘The most widespread revolution in the workplace will come from the rise in collaboration and the decline of hierarchy.’

Collaboration will help us filter information, create knowledge and make better decisions.

Wed, 21 May 2003 14:19:40 GMT

The idiocy in charging for online access to newspaper archives. Dave Winer:
Pfui. The [New York] Times can’t possibly be a factor in Google searches for the simple reason that the Times archive is not accessible to Google. It’s behind a for-pay firewall.

There’s basically a very simple rule. If you want to be in Google, you gotta be on the Web.

The traditional media still don’t understand what the Web is about, sadly. [Jinn of Quality and Risk]

The NYT, and others, make money by keeping the archives closed. And it helps Lexis-Nexis continue their business model. Even if some of them do understand the Web, convincing the money people would be like talking to a brick wall. I just do not see a for-profit company giving up this revenue stream without something else to replace it. It is often because of this that companies cannot deal with disruptive technologies. It takes a lot of courage for a company to remove one source of income before another has replaced it. But it will be the courageous companies that survive in the Information Age, especially if their basic business model is the dispersal of information, like the NYT.

Tue, 20 May 2003 22:38:54 GMT

Your kid is not an empty storage container, ready to be filled with curricular content.

Stories like this creep me out, even if they say Primary school testing and targets are to be streamlined to make exams for seven-year-olds less formal and part of a wider teacher-led assessment yada yada.

Testing programs are not about educating kids. They’re about perpetuating the bell curve. As a kid who spent most of his formative years at the back ends of nearly every bell curve the system could throw at him, and who regarded his school experience as a 13-year prison sentence that commenced at age 5, I can tell you there isn’t a damn thing in any top-down government-mandated educational testing program that answers any kind of market demand from kids themselves – who are born with extravagantly unique souls, each with its own agenda and an endless series of questions (there’s your real demand) for the purposes of its own education. Few of those questions are addressed by official curricula, testing programs, or even compulsory school attendance.

The unintended agenda of bureaucratized education is laid out very nicely in The Six Lesson Schoolteacher, by John Taylor Gatto in 1991. Dig it.

[The Doc Searls Weblog]

Well, it is an article about England, but many of the points are just as true for American public schools. They are all top-down hierarchies that are ill-equipped, in my opinion, to deal effectively with the current era. Where they work, it is through the bottom-up approaches taken by individual teachers – teachers who are seldom rewarded for their effort. The Six-Lesson Schoolteacher is well worth reading, even if you disagree with it. Modern public education is an outgrowth of the needs of the Industrial Revolution. Standardization is what drove this revolution, and these processes were applied to education. We need a new formulation of public education to deal with the Information Age. I hope we see this in my lifetime. I fear it will be as big a battle as any, but the groups that learn how to do this will succeed at a more rapid pace than those that follow old processes. This will, of course, scare the old guard, which will react in ways that only hasten their own demise.

Doc On NYT Archives

Doc Searls has more on the NYT archive approach. Virginia Postrel, who writes a column for the NYT, has been looking at this also. She has asked her editor for an explanation. I wonder how much more the Times might make if it allowed free access to any subscriber, as most scientific journals do. Virginia quotes from an email I sent her and then adds:

The Times wouldn’t even have to make access free to profit from a freer flow of information. WSJ subscribers can get the whole online edition for $29 a year; the Times seems to think its readers will pay that much to read just 10 of my columns. I’m afraid not.

Information may want to be free but it does not have to be free. That is, restrictions of information flow will be reduced but that does not mean you can not receive payment for your work. The proper price point must be found. As Apple and the iTunes Music Store have demonstrated, people will pay for convenient information flow. In my mind, they are not really paying for the information. They are paying for the convenient delivery of that information. The NYT and other content creators need to find the proper price point. $3 per article is not it.

Tue, 20 May 2003 05:29:33 GMT

As Google Goes, So Goes the Nation. I’ve been debating whether or not this story was worth posting here. The NY Times has a short opinion piece weighing in on the recent debate over whether or not blogs have “too much power” in Google, which is something of a ridiculous debate when you really get to thinking about it. If Google isn’t doing their job, and not providing people with what they want, then it just opens up an opportunity for someone else to provide a better search. Seems pretty straightforward to me. However, the more interesting part of this story are the responses to the NY Times article from folks like Doc Searls and Dave Winer, making a really good point: the popular press seems to be complaining that blogs are outweighing their own stories on certain topics. A large part of the problem, though, is the short-sighted view of the publishers of these press websites that hide their archives behind tollbooths. The simple fact: “If you want to be in Google, you gotta be on the Web.” Indeed. It’s certainly up to those sites and their management over whether or not to open themselves up – but (like so many businesses these days) they seem to be going for short-term direct revenue, rather than realizing the long-term potential for (much greater) indirect revenue by opening up their archives. [Techdirt]

More discussion about the inability of major media to understand just what is going on. People will not find their articles unless they allow them to be indexed by Google. As with anything else, it does no good to have the best writers if no one reads them.

Mon, 19 May 2003 07:06:06 GMT

Visualizing Flows In Social Networks. Then we started to work on bigger social spaces where people communicate in a circle: one speaks to another, and that one to another one, forming a social space called Carousel that is symmetric, but with no poles. Or you have what appears to be the same group of people, but there is one in the centre who is the chief gossip of the group, the one who starts all the conversations.

In technical terms this is still a network, but in social terms the social space is absolutely centrifugal. This is what we call the Petals.

In this combination, one expert, or head of the tribe, transfers all his music expertise to the others, so it is the same as we saw before, but this time with a pole; he’s the head of the tribe, so the social space is asymmetric, and it is called the Crest.

Then we have many communities, each one with the mechanism we saw before: one starts to broadcast to the other one, and everybody broadcasts to everybody else, so we have an overlapping peer-to-peer broadcast and so on. And content is what holds everything together.

This is what we call the Infinite Star, a continuous space that connects infinite points. [Smart Mobs]

There are some really pretty GIFs on this page, and it includes some wonderful insight into social systems. I think that computing tools only increase the complexity of these systems, making them into something that cannot be easily quantified.

