Monsters University reflects what is really happening.

by RJ Bailey

Monsters University: the Aftermath
[Via Crooked Timber]

Monsters University, the prequel to Monsters, Inc, opened this weekend. I brought the kids to see it. As a faculty member at what is generally thought of as America’s most monstrous university, I was naturally interested in seeing how higher education worked in Monstropolis. What sort of pedagogical techniques are in vogue there? Is the flipped classroom all the rage? What’s the structure of the curriculum? These are natural questions to ask of a children’s movie about imaginary creatures. Do I have to say there will be spoilers? Of course there will be spoilers. (But really, if you are the sort of person who would be genuinely upset by having someone reveal a few plot points in Monsters University, I am not sure I have any sympathy for you at all.) As it turned out, while my initial reactions focused on aspects of everyday campus life at MU, my considered reaction is that, as an institution, Monsters University is doomed.

[More]

Read the entire post first.

While written in a satirical style, the post holds some real insights. Higher education is being tremendously disrupted and is dealing with that disruption poorly.

Here is the key point that the movie demonstrates (spoilers).

Our heroes are expelled from Monsters University – whose reason for existence is to train and certify employees who can produce the power needed to run their society by inducing fear into children.

They then get jobs at Monsters, Inc. and rise to become the foremost employees at the company, even though they lack the proper certifications. Then, in an act of total disruption, they discover a new source of almost unlimited power: laughter.

Here is how the post puts it:

The consequences for Monsters University are obvious, and chilling. Two expelled former students have gone on not only to rise to a level of occupational success that ought to be impossible without an MU credential, but have discovered new fundamental facts about the world that completely undermine the knowledge base of Monsters University as an institution. It’s as if Jobs and Wozniak were also Fleischmann and Pons. The School of Scaring, which we hear early on is the “crown jewel” of MU, is now completely outmoded and also, surely, entirely delegitimated.

Why should this institution of higher learning exist if its training is no longer relevant? It has spent all its energies on a School of Scaring. How will it deal with the new, disruptive demand for Laughter Learning?

This is a simpler version of what is really happening. Massive Open Online Courses are opening up learning to anyone and offering credentials through an entirely different system than the traditional BA.

We are seeing high schoolers producing innovations that have already attracted the interest of large companies, which may be willing to hire for that creativity directly, without the need for college.

The latest winner of the Intel Science Talent Search is developing a new source of energy: algae that produce oil. She did all the work in a home-built lab. Another student built a cheap pulsed plasma device, bringing this technology to the masses.

And it is not only in America. At the recent Intel International Science and Engineering Fair, a student from Romania figured out a way to accomplish for $4,000 what cost Google over $70,000.

Any of these kids could simply work for a lot of money or even attract capital to start their own company. All without needing a certificate from a university.


Adaptive Lighting Could Slash Electricity Bills

[Via Discovery News - Top Stories]

A business-card-size device monitors available light and makes adjustments as needed.

A new development from MIT could slash a home’s energy budget in half. Researchers Matthew Aldrich and Nan Zhao built a system that’s able to monitor available light and adjust it automatically. The setup was made using LEDs, the most efficient form of lights that are commercially available. Unlike compact fluorescent bulbs, LEDs can be adjusted to any level of lighting intensity.
[More]

LED lighting lasts for perhaps 50,000 hours and can be dimmed, something compact fluorescents cannot do. Add a smart sensor, and the lighting can be distributed more efficiently.

This could have a huge effect on energy usage, particularly in commercial settings. According to the original research <http://web.mit.edu/newsoffice/2010/adaptive-lighting-1119.html>, the system can also use multiple LEDs of different colors to modify the color balance depending on the need.

The researchers found that they could achieve massive reductions in light usage, up to 90 percent. Since LEDs already have one of the best light-output-per-watt ratios, the overall reduction could be enormous.
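To make the idea concrete, here is a rough sketch of the kind of control loop such a system might run. All of the numbers, names, and the dimming formula are my own invention for illustration, not details from the MIT work:

```python
def led_duty_cycle(ambient_lux, target_lux, max_led_lux=400):
    """Return the LED dimming level (0.0 to 1.0) needed to top up
    ambient light to the target illuminance."""
    shortfall = max(0.0, target_lux - ambient_lux)
    # Clamp to full brightness if ambient light alone cannot reach the target.
    return min(1.0, shortfall / max_led_lux)

# On a bright afternoon the sensor reads 350 lux; the LED barely fires.
print(led_duty_cycle(ambient_lux=350, target_lux=400))  # 0.125
# At night the sensor reads 0 lux; the LED runs at full power.
print(led_duty_cycle(ambient_lux=0, target_lux=400))    # 1.0
```

The energy savings come from exactly this gap: most of the day, daylight covers most of the shortfall, so the LED idles near zero.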

And I loved this quote – my bold:

Jeffrey Cassis, the CEO of Philips Color Kinetics, a leading manufacturer of LED lights, says that the team doing this work “is world-class — they are working on a hard problem and a quality, cost-effective solution has great potential.” He says that it is important to use a systems approach, as this team is doing, looking at the whole lighting system rather than just individual components. But he adds that the cost of the finished system, as well as how easily it can be used to retrofit existing lighting systems, will be crucial factors in determining its adoption.

A generational war

Photo by kentbye
[Crossposted at SpreadingScience]
Social Media vs. Knowledge Management: A Generational War:
[Via Enterprise 2.0 Blog]

You’d think Knowledge Management (KM), that venerable IT-based social engineering discipline which came up with evocative phrases like “community of practice,” “expertise locater,” and “knowledge capture,” would be in the vanguard of the 2.0 revolution. You’d be wrong. Inside organizations and at industry fora today, every other conversation around social media (SM) and Enterprise 2.0 seems to turn into a thinly-veiled skirmish within an industry-wide KM-SM shadow war. I suppose I must be a little dense, because it took not one, not two, but three separate incidents before I realized there was a war on. Here’s what’s going on: KM and SM look very similar on the surface, but are actually radically different at multiple levels, both cultural and technical, and are locked in an undeclared cultural war for the soul of Enterprise 2.0. And the most hilarious part is that most of the combatants don’t even realize they are in a war. They think they are loosely-aligned and working towards the same ends, with some minor differences of emphasis. So let me tell you about this war and how it is shaping up. Hint: I have credible neutral “war correspondent” status because I was born in 1974.

[More]

A very clear post that describes the conflict between Boomer and Millennial thinking when it comes to dealing with large amounts of data. Knowledge management (Boomer) is a top-down, put-the-data-in-the-proper-bin sort of approach. There is a name for each bin, and everything needs to fit into the correct one.

Social media (Millennial) uses human social networks in a bottom-up approach that allows the data to determine where it should go. Any bin that it should go into is an emergent property of the network created by the community.
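The difference can be sketched in a few lines of code. In the top-down model the categories exist before the data; in the bottom-up model the bins are simply whatever tags the community happens to apply. The files and tags below are invented for illustration:

```python
from collections import defaultdict

# Bottom-up: users tag items freely, and the bins emerge from the tags.
user_tags = [
    ("q3-report.xls", "finance"),
    ("q3-report.xls", "quarterly"),
    ("team-photo.jpg", "offsite"),
    ("budget-memo.doc", "finance"),
]

emergent_bins = defaultdict(list)
for item, tag in user_tags:
    emergent_bins[tag].append(item)

# Nobody declared a "finance" category in advance; it exists only
# because two people reached for the same word.
print(dict(emergent_bins))
```

A KM system would instead start from a fixed taxonomy and reject anything that did not fit one of its predefined categories.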

Read the whole post for a nice dissection of what is happening in this war. Just remember that age is not as important as attitude. There are Boomers who get social media and Millennials who do not.

I think it is that one personality wants things to be black and white (the data is in a database on THIS computer) while the other deals well with shades of gray (the data is in the cloud and not really in any one place).

I did my post-doc in a chemistry lab, as the only biologist, and I saw something very valuable. Chemistry is very process-driven. The purpose of a process is to reproduce success. If a process, say a particular chemical synthesis, did not work (as in the yield was 10% instead of 90%), it was not the fault of the process. The reagents were bad or the investigator was incompetent. But the process was still valid.

So chemistry selected for people who were very process-driven and wanted things tightly controlled and well defined.

Biology has a very different regard for process. The same process (say the cloning of a gene) can be done on two different days and get different results (10 colonies of cells one day; 500 the next). Biology is really too complex to be able to control everything. A lot of things can go wrong and it can be really easy to fool oneself with results.

So biology, particularly at the cutting edge, selects for people who can filter out extraneous bits of data, can be comfortable with conditional results and with the general anarchy that can occur. Every molecular biologist has experienced the dreaded ‘everything stops working, so I have to remake every buffer, order new reagents and spend a month trying to figure out what happened, knowing that things will start working again for no real reason.’

Chemists in my post-doc lab hated biology because of the large variance in results compared to chemistry. Biologists are often happy to be within an order of magnitude of expected results.

One way of thinking has to know whether Schrödinger’s cat is dead or alive, while the other is comfortable knowing it is simultaneously dead and alive.

Biology needs the Millennial approach because it is creating data at too fast a pace to put it all into bins. Social networks can help tremendously with the filters needed to find knowledge in the huge amount of data.


Mon, 02 Jun 2003 06:08:46 GMT

RSS’ Growing Importance.

The RSS feed is growing in importance. Two recent comments are an indication of the attention being paid to the fact that RSS may become the new way to disseminate information.

Writes Jon Udell:

Direct one-click access to RSS sources is suddenly a lot more interesting. It used to be that RSS aggregators were few. Now they are many — because every copy of Radio is one. The people running these aggregators can now start to trade channels as we used to trade links.

The benefits of this new RSS fluidity, which kicks things up a level of abstraction, seem obvious to me, and will seem obvious to anyone who finds their way here to read this. But those benefits will not be obvious to most people. Casual use of ordinary links is still not nearly as prevalent in routine business and personal communication as it ought to be. The kind of meta-linking possible with channel exchange will seem even more exotic. The challenge — and opportunity — is to make all this as easy and natural as most people think email is.

Adds Tim Bray:

Eventually there will be business models built around weblogs, with more popular ones being more lucrative. And while the Pagerank-style ratings produced by Technorati, Daypop and so on are important, the big question is going to become: “how many subscribers do you have?”

[E M E R G I C . o r g]

RSS, aggregators and weblogs have benefits that are not immediately obvious. As in all paradigm shifts, if you are on one side of the paradigm, you simply cannot understand what the person on the other side is talking about. People forget that email was not intuitively obvious to most when it first began to be used. I watched it take two years at Immunex before you could be certain that another scientist would read your email that same day, so that you did not have to walk down the hallway to ask whether they had read it. We forget just how long it can take for these sorts of empowering technologies to filter through even the most creative and innovative communities. RSS and blogs are even more non-obvious to many. But the power they provide, their ability to increase information flow, will make them extremely useful to those who use them.
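And what an aggregator actually does is quite mundane: fetch an XML file and pull out the item titles and links. Here is a minimal sketch using Python’s standard library, with an invented feed inlined instead of fetched over the network:

```python
import xml.etree.ElementTree as ET

# A tiny RSS 2.0 document, inlined here rather than fetched over HTTP.
feed_xml = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <item><title>RSS' Growing Importance</title>
          <link>http://example.com/rss-importance</link></item>
    <item><title>A Little Less Conversation</title>
          <link>http://example.com/social-software</link></item>
  </channel>
</rss>"""

root = ET.fromstring(feed_xml)
items = [(i.findtext("title"), i.findtext("link"))
         for i in root.iter("item")]
for title, link in items:
    print(title, "->", link)
```

That is the whole trick: once content is in this machine-readable shape, every reader can also be a republisher, which is what makes channel-trading possible.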

Fri, 30 May 2003 21:21:18 GMT

A Little Less Conversation. The whole social software craze is already getting quite a lot of backlash. This article from the BBC reiterates points that many social software critics have, which is that the people working on social software think they’ve discovered something new, when it’s really just the latest version of work that’s been going on for ages. Worse, those designing and promoting social software are accused of ignoring all the lessons people learned before them about human-computer interaction and how computers really play into social interaction. I tend to agree that many people, when they start working on hyped trends, often ignore very important work that’s been done before – but that doesn’t mean the new work isn’t useful as well. Both sides seem to be taking an antagonistic viewpoint on this debate, which is becoming more and more a debate over the semantics of social software, rather than looking at how the software is actually being used. In the end, people will keep using what works, and won’t worry whether it’s officially “social software” or some other term.
[Techdirt]

What is different this time is that many of the tools being lumped into social software are actually easy for people to use. They allow someone to get up and going in a community without needing to understand just what is happening. Social software is different from knowledge management systems, although both try to solve similar problems. KM software tends to be monolithic and top-down, requiring the user to learn an arcane and non-intuitive viewpoint in order to use it. Many of the social software tools are simple enough to be easily manipulated. The user can find the best one to use, the one that fits their own viewpoint. This is a big, but subtle, difference.

Discussion On Blog Discussions

It is too late for me to write more about this but Tom is right on track. Blogs filter important information and disperse it rapidly in a fashion that has been impossible before. As we get better at using this approach, we will see a larger explosion of its filtering properties.

Sun, 25 May 2003 05:44:00 GMT

Bursty Community Formation in Blogspace. Absolutely fascinating paper on community formation in blogspace, by Ravi Kumar, Prabhakar Raghavan, Jasmine Novak, and Andrew Tomkins, called On the Bursty Evolution of Blogspace. (Free ACM account required — it’s so worth it, just for this article.)


The authors develop a method of measuring time-stamped link-space, so that blogspace can be mapped based not just on links, but links by date, allowing them to track the formation of communities, defined here as a dense cluster of weblogs all pointing back and forth to one another.

Using this method, they put some meat on the bones of what everyone knows:

Within a community of interacting bloggers, a given topic may become the subject of intense debate for a period of time, then fade away. These bursts of activity are typified by heightened hyperlinking amongst the blogs involved — within a time interval.

They then go on to identify several examples of communities coalescing in a brief period of time around a set of posts — WannaBeGirl’s blog poetry in 2000, or Dawn’s Funniest/Sexiest Blogger poll from 2002. (Unsurprisingly, both examples used posts about other people to get those people’s attention.)

They outline their method for crawling and analysing blogspace while looking for these burst-forming communities, and the algorithm looks like a useful feature for ongoing exploration of blogspace. (Paging David Sifry. David Sifry to the white courtesy telephone…) They also segment blogs by in-bound links:

…pages linked-to by an enormous number of other pages are too well-known for the type of communities we seek to discover; so, we summarily remove all pages that contain more than a certain number of in-links.

in order to differentiate between community participation and publishing (an argument I’ve been groping towards in Communities, Audiences and Scale, and Weblogs, Power Laws and Inequality, but the algorithms here are far more precise than my descriptions).
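Their pruning step is easy to sketch. The toy graph, blog names, and threshold below are mine, but the idea is the paper’s: drop the too-famous pages first, then look for mutual links as the seed of a community:

```python
from collections import Counter

# A toy time-stamped link graph: (source, target, week) edges.
edges = [
    ("blogA", "blogB", 1), ("blogB", "blogA", 1),
    ("blogC", "blogA", 2), ("blogA", "blogC", 2),
    ("blogA", "bigportal", 1), ("blogB", "bigportal", 1),
    ("blogC", "bigportal", 2), ("blogD", "bigportal", 3),
]

# Step 1 (the pruning quoted above): drop pages with too many in-links;
# they are audiences, not communities. The threshold of 3 is arbitrary.
in_links = Counter(dst for _, dst, _ in edges)
too_famous = {n for n, k in in_links.items() if k > 3}
pruned = [(s, d, w) for s, d, w in edges
          if s not in too_famous and d not in too_famous]

# Step 2: look for mutual links, the seed of a "dense cluster of weblogs
# all pointing back and forth to one another." (The timestamps are
# carried in the data but ignored in this simplified pass.)
edge_set = {(s, d) for s, d, _ in pruned}
mutual = {frozenset((s, d)) for s, d in edge_set if (d, s) in edge_set}
print(sorted(sorted(pair) for pair in mutual))
```

Removing the big portal is what lets the small reciprocal cluster show up at all; with it left in, every blog looks connected to everything.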

Finally, they analyze the changes in their data set overall, and come to two remarkable conclusions: first, 2001 really was the unusual year, with the link structure at both a macro and micro level taking a remarkable jump in density.


Second, there is a core set of blogs that forms a strongly connected cluster, and it is growing rapidly:

But up to this point, blogspace is not a coherent entity — the overall size has grown but the interconnectedness is not significant. At the start of 2001, the largest component begins to grow in size relative to the rest of the graph, and by the end of 2001 it contains about 3% of all nodes. In 2002, however, a threshold behavior arises, and the size of the component increases dramatically, to over 20% by the present day. This giant component still appears to be expanding rapidly, doubling in size approximately every three months. Clearly this growth cannot continue and must plateau within two years.
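The plateau claim at the end of that passage is simple arithmetic. A component that already holds 20% of all nodes and doubles every three months would overrun the entire graph in under a year (this back-of-the-envelope check is mine, not the paper’s):

```python
# Start at 20% of all nodes and double every quarter until the
# component would exceed the whole graph.
share, quarters = 0.20, 0
while share < 1.0:
    share *= 2   # doubling every three months
    quarters += 1
print(quarters * 3, "months")
```

Three doublings (20% to 40% to 80% to beyond 100%) take only nine months, so the growth must flatten well inside the authors’ two-year bound.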

Oh, and they prove that blogspace is not a random graph, and conclude that blogspace can better be analyzed as a set of inter-networking communities than as a set of stand-alone blogs.

It’s too early to tell for sure, but this paper feels absolutely seminal. I know it’s a pain to set up another online account, but do it anyway, and then go read the paper. (Thanks, Hylton) [Corante: Social Software]

I’m setting up my account. A single blog does not work. It is the community of blogs that is important. An idea gets started in a sort of amorphous way and then is intensified and remodeled as it moves through the community until it reaches a tipping point and explodes. We see this again and again. It is what makes weblogs a different medium than any other. It may be why power laws are not that important. We shall see.
