When I went to college from 1988 to 1992, my experience was similar to that of many science-oriented students today. I went to classes, where a professor stood in front of the room and told us facts and perspectives about their subject. In most biology courses, my role was clear-- listen, take notes, and occasionally (in genetics, especially) learn approaches to solving particular kinds of problems. Students asked questions occasionally, but probably over half the class never asked the professor a single question all semester, and no one had the audacity to approach the professor with questions after every class. Students may have asked the teaching assistant a few questions, but these TAs were often only marginally more familiar with the material than the students they oversaw. Honestly, much (though not all) of this so-called learning involved rote, short-term memorization of facts. If we students got something wrong on an assessment, it was our fault for not having understood the material from its single presentation, irrespective of how well that material was presented or reinforced.
The world is different today. There are very few facts that I cannot find in mere seconds using my computer or smartphone. Virtually everything known is accessible to the world on the internet, though there is also a great deal of misinformation on the internet portrayed as fact.
As I see it, universities have two potential educational roles in this new era (I'm not addressing research roles here). The first is a service role to the community. Universities have always been storehouses of knowledge and understanding, but it's both arrogant and short-sighted for universities to perceive this role as serving only their students or in-field colleagues, particularly given the amount of public funding they receive. University faculty can serve their communities and the world by providing or "authenticating" facts, evidence, and diverse perspectives in their areas of study through the internet and other media. Such dissemination is not an act of arrogance-- university faculty are already regularly consulted by the media to interpret new findings or perspectives in their areas, given their expertise and training, and this role merely asks them to be more proactive. The public's interest in such reliable sources is real-- if one were diagnosed with cancer, would one prefer to just Google "cancer" and look up information on whatever site comes up (perhaps "homecancerremedies.com")? Or would one prefer to get information from the National Cancer Institute of the NIH? Presumably the latter, and this example illustrates the public's perceived value of "authenticated" information.

Similarly, if I wanted to learn more broadly about genetics or psychology or economics or art history, I'd love to take a free online course (or "MOOC") from a practitioner who holds an advanced degree in the area and was hired by a university as an expert in it. Neither of these credentials guarantees that the information will be presented coherently, that the presenter won't be wrong, or that better resources don't exist, but it's a safer starting point on the road to learning than a random internet search. MOOCs are not the only means of public dissemination, but they are a good one, both effective and engaging.
Freely providing knowledge is not only an important gesture by universities to their communities, but arguably an obligation, and it can also facilitate the learning of their on-campus students (see below).
The second role I discuss is the university's primary one-- to help its enrolled students learn. This learning can no longer be rote, short-term memorization of facts-- such "learning" trivializes the role of the university relative to the internet. Instead, we need to engage with students directly, and in a manner that far exceeds what is possible through the internet or free online MOOCs. Our courses need to go beyond fact dissemination-- we need to engage students both individually and in groups to assess how well they are interpreting and applying the concepts we're presenting to them. The flipped class is one means of achieving this goal-- students get the primary content in some way outside the class period, and their understanding is then assessed. This assessment step is critical-- students learn which elements of the material they didn't correctly interpret or apply the first time, and faculty receive feedback that lets them correct frequent student misinterpretations and misapplications in their presentations. The faculty then spend the class period clarifying areas of confusion directly in response to the student feedback, and reinforcing true understanding of the material with new problems, applications, and engaging discussions. The format forces faculty and students to interact bidirectionally in the learning process, and this bidirectionality has obvious benefits both for student understanding and for faculty teaching strategies. It's also personally satisfying for both parties: as faculty become less "lecturers" and more "facilitators" in the classroom, they work with the humanity of students rather than treating students as consumers of prepackaged products.
Relatedly, I've become a firm believer in "open-book" assessments, too, for two reasons-- 1) the world is essentially "open-book," so assessing students in a situation where simple facts cannot be quickly checked is (usually) unrealistic, and 2) it forces the faculty member to produce questions that require more than regurgitation of facts presented in the course, and thus better assess student understanding at a higher level.
None of what I've said above is novel or revolutionary. However, many faculty and students are too comfortable with standard lecture formats for our classes (especially in the sciences and social sciences-- less so in the humanities and interpretive social sciences) and are resistant to changing these roles, particularly given the upfront work involved. While our time is limited, the goal of everyone at a university (students and faculty alike) should be to promote the best learning possible, so isn't this worth the investment? Similarly, many people view MOOCs as a threat to our universities-- we're giving away for free what students paid thousands of dollars to receive. Some have said that MOOCs are a means for the "elite" universities to secure their position and displace others by disseminating content possibly (and often incorrectly) perceived by the broader public as "better" than what a good state or liberal arts school may provide. I argue that, if colleges or universities fail to provide opportunities for MUCH more mentoring and learning in their on-campus classes than what happens in topically equivalent MOOCs, they're wasting their students' time and money. MOOCs can educate the public and can be a tool to enhance or supplement on-campus classes, but they are no replacement for what an on-campus undergraduate education should be.
Times have changed, and forward-thinking universities are beginning to change accordingly. It's up to universities and their faculty to keep up with these changing times. If universities don't change quickly, prospective students will soon figure out which schools are least likely to provide a return on their investment...