Type I webinars are a mistake in 2018, and they need to die. We can leave them behind and take our presentations and conversations to other platforms, either Type II or by flipping the webinar. Or we can re-invent, re-use, and reboot Type I. At a time when discussions are more fraught and also more needed, we should do this now.
At a minimum, Facebook has long needed an ombudsman’s office with real teeth and power: an institution within the company that can act as a check on its worst impulses and protect its users. And it needs a lot more employees whose task is to keep the platform healthier. But what would truly be disruptive and innovative would be for Facebook to alter its business model. Such a change could come from within, or it could be driven by regulations on data retention and opaque, surveillance-based targeting—regulations that would make such practices less profitable or even forbidden.
But the backlash wouldn’t die down. Attempting to respond to the growing outrage, Facebook announced changes. “It’s Time to Make Our Privacy Tools Easier to Find”, the company announced without a hint of irony—or any other kind of hint—that Zuckerberg had promised to do just that in the “coming few weeks” eight full years ago. On the company blog, Facebook’s chief privacy officer wrote that instead of being “spread across nearly 20 different screens” (why were they ever spread all over the place?), the controls would now finally be in one place.
Sadly, this has nothing to do with users or community:
As far as I can tell, not once in his apology tour was Zuckerberg asked what on earth he means when he refers to Facebook’s 2 billion-plus users as “a community” or “the Facebook community.” A community is a set of people with reciprocal rights, powers, and responsibilities. If Facebook really were a community, Zuckerberg would not be able to make so many statements about unilateral decisions he has made—often, as he boasts in many interviews, in defiance of Facebook’s shareholders and various factions of the company’s workforce. Zuckerberg’s decisions are final, since he controls all the voting stock in Facebook, and always will until he decides not to—it’s just the way he has structured the company.
Tim Wu argues that we need to replace Facebook with a trustworthy platform not driven by surveillance and advertising:
If today’s privacy scandals lead us merely to install Facebook as a regulated monopolist, insulated from competition, we will have failed completely. The world does not need an established church of social media.
Notice in the peer evaluation form below that the students would enter the names of their peers and the project title. This free-text entry has a high probability of producing messy data. As a teacher, you will want to sort and filter the peer evaluation results by each student’s project. For this, you will want each student’s name and project title spelled exactly the same.
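To illustrate why consistent spelling matters, here is a minimal sketch (the names, responses, and the `normalise` helper are hypothetical, not part of any real form) of how free-text variants break grouping unless they are cleaned first:

```python
# Illustrative sketch (hypothetical data): inconsistent spellings stop
# peer-evaluation rows from sorting and filtering by project.
from collections import defaultdict

# Free-text entry lets the same project appear under several spellings.
responses = [
    {"peer": "Alex Smith", "project": "Solar Car"},
    {"peer": "alex smith", "project": "solar car "},
    {"peer": "Jo Lee",     "project": "Solar Car"},
]

def normalise(text):
    """Collapse case and stray whitespace so variants group together."""
    return " ".join(text.strip().lower().split())

by_project = defaultdict(list)
for r in responses:
    by_project[normalise(r["project"])].append(normalise(r["peer"]))

# Without normalise(), "Solar Car" and "solar car " would be two groups;
# with it, all three rows fall under a single project.
print(dict(by_project))
```

In practice, a dropdown of pre-entered names and project titles in the form itself avoids the problem before any cleanup is needed.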
In the second of a two-part series, Michael Brull looks at the scandal that is wiping billions of dollars in value off the world’s richest company… and it’s about much more than just social media and data mining. A British news program on Channel 4 has exposed Cambridge Analytica and Facebook for what has become…
We only know about Cambridge Analytica because British investigative reporters posed as Sri Lankans hoping to recruit the firm for a campaign. That is, our information about what other organisations like Cambridge Analytica do is fragmentary. We don’t know if the Clinton campaign acted similarly. We don’t know how such firms affected campaigning in Australia. We don’t know if they harvested data on Australians, or sold that data to Australian politicians or their electoral campaigns.
The reality is, there is no way of truly knowing who is spending what when the information being generated is inserted into the bloodstream of the internet.
The four types of discussion found online can be used to identify the general tendencies individuals have as they communicate, comment, and react in online spaces. An individual may have a series of posts and comments that spread across multiple quadrants as they socialize and participate in online spaces. Yet the quadrant where the largest concentration of messages falls on this model identifies the type of communication a person generally engages in.
This matrix really has me thinking, especially about different contexts online. For example, in a Twitter chat, when you have different people meeting together with different intents (dialogue vs. debate), how is it that it works? Or does it?
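As a rough sketch of the model's rule (the quadrant labels and the `dominant_mode` function here are my own illustrative assumptions, not taken from the matrix itself), a person's general mode can be read off as the quadrant holding the largest share of their tagged posts:

```python
# Minimal sketch (hypothetical tags): the matrix's rule that the quadrant
# with the largest concentration of a person's messages identifies their
# general type of online communication.
from collections import Counter

# Assume each post has already been classified into one of the quadrants.
posts = ["dialogue", "debate", "dialogue", "discord", "dialogue"]

def dominant_mode(tagged_posts):
    """Return the quadrant with the highest concentration of messages."""
    counts = Counter(tagged_posts)
    return counts.most_common(1)[0][0]

print(dominant_mode(posts))  # -> 'dialogue'
```

A Twitter chat is exactly the hard case: participants arrive with different dominant modes, so no single quadrant describes the conversation as a whole.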
There’s one person responsible.
The time allocated matches what’s needed, not what the calendar app says.
Everyone invited is someone who needs to be there, and no key party is missing.
There’s a default step forward if someone doesn’t come.
There’s no better way to move this forward than to have this meeting.
The desired outcome is clearly stated. The organizer has described what would have to happen for the meeting to be cancelled or to stop midway. “This is what I want to happen,” and if there’s a “yes,” we’re done.
All relevant information, including analysis, is available to all in plenty of time to be reviewed in advance.
This is a checklist to come back to regularly.
“There is no such thing as a typical day. Every student’s day is different and no two students have the same timetable.”
I worked at a school that went with a choice based program a few years ago. The problem with it was that it was as old as I was.
Although the students had choice, it was choice over what teachers were willing to offer. I guess that would be the next step.
I like the work Greg Miller is doing in this area.
Depending on the purpose of your curation, there are certain tools that may fit your needs better than others. This list has it all! Whether you are curating professional learning resources, planning a lesson, or creating something to share, there’s a tool below that can help you do it!
The activist and internet entrepreneur Maciej Ceglowski once described big data as “a bunch of radioactive, toxic sludge that we don’t know how to handle.” Maybe we should think about Google and Facebook as the new polluters. Their imperative is to grow! They create jobs! They pay taxes, sort of! In the meantime, they’re dumping trillions of units of toxic brain poison into our public-thinking reservoir. Then they mop it up with Wikipedia or send out a message that reads, “We take your privacy seriously.”
McNamee no longer invests in tech companies. “Philosophically, it wasn’t a good fit,” he says. In Facebook, he notes, one can clearly see the impact that certain philosophies have had on corporate culture. “The two most influential people on [Facebook’s] board of directors over the last seven to eight years have been Peter Thiel and Marc Andreessen, both of whom are brilliant men whose economic and political philosophy is deeply libertarian. So in a world where we already prioritize the individual over the collective and we take the Ayn Randian view that none of us are responsible for the downstream consequences of what we do, Mark was surrounded by people who were particularly deep believers in that philosophy, with no contrary voices.”
There’s always the chance that #deleteFacebook will simply serve to deflect criticism away from the dominant ethos of surveillance capitalism by redirecting it at Facebook. Thus, people rage against Facebook instead of the ideology that Facebook shares with many other companies. It’s easy to imagine Google trying to capitalize on the current mayhem at Facebook by using the current frustration as an opportunity to relaunch Google+ (they could create tools that make it easy to import an old Facebook account). But that would just be trading one surveillance capitalism platform for another. And though there are certainly hardcore privacy and crypto advocates who will point to various “secure” services or “really private” alternatives it seems that many such arguments are only a bit better than “no suggestion at all” – especially as (at least as of yet) there still isn’t a genuine alternative to Facebook on offer. Though #deleteFacebook may appear ready to take a bite out of Facebook, it risks being a technological solution that defangs the push for broader systemic change and critique.
There are several laws that might plausibly give rise to legal claims against Facebook, Kogan or Cambridge Analytica. Without more information it is difficult to say which of these, if any, might actually lead to a viable legal claim, but each one merits further study. (I am leaving aside for now the potential claims under British and European law, but those add to this list considerably.)
- Computer Fraud and Abuse Act
- State-level Computer Crime Laws
- U.S. Common Law Claims (Contract & Tort)
- Federal Trade Commission Rules
- U.S. Securities Law
School libraries have been called instructional media centres, media centres, information centres, information commons, iCentres, learning labs, learning commons, digital libraries, and cybraries (Farmer, 2017). These terms are in some ways faddish and transitory. ‘Library’, however, has a deep and long tradition associated with it, although the spaces and tools of libraries change over time. Librarians in schools have also had many names, such as teacher librarian, library teacher, library media specialist, library media teacher, cybrarian, information navigator, information specialist, information professional, informationist, and information scientist (Farmer, 2017; Lankes, 2011). Lankes (2011) argues that the terms ‘library’ and ‘librarian’ are entwined with the concept of knowledge and learning. I have said before that those claiming disruption should embrace interrogation of their ideas. Does ‘library’ need to be disrupted, in what ways, and why (or why not)?
- Neutral and democratising;
- Participatory and connected locally and globally;
- Centred around learning, literacy, research, and knowledge; and
- Facilitators of interdisciplinarity.
She also created a three-circle Venn diagram to represent the contested nature of the space:
I have written about the future of libraries before; however, Netolicky’s deep dive takes it a step further.
A plan to mine data from Commonwealth Games visitors who use free high speed wi-fi has been dumped by Gold Coast City Council a day after it was reported by the ABC.
Originally the council was going to require people to use their Facebook login if they wanted fast wi-fi.
Council told the ABC it would collect some data from users’ Facebook pages including their age, nationality and gender.
- 1993-1997: The Information Superhighway
- 1999-2002: The Wild West
- 2003-2007: The Web 2.0 era
- 2008-2012: The Era of the App
- 2013+: The Post-Snowden era
I wonder if this will be another shift?
People passionately argue that there ARE no “wrong answers” when it comes to using technology in teaching and learning. Or they passionately argue that you CAN’T do any of the tasks in the right hand column without the tools listed in the left hand column. Or they passionately argue that by labeling the actions in the left hand column “wrong answers,” I’m hurting people’s feelings and alienating teachers who aren’t quite ready to take kids towards the behaviors listed in the right hand column.
But like it or not, I’ve chosen those words deliberately.
In support of this, he provides three reasons why he stands by his assertion of ‘wrong’:
- It provides a starting point for conversations about the use of technology
- If teachers aren’t looking beyond tools when making instructional choices, their decision-making really is flawed
- He is not buying the alienation argument
This comes back to his argument that technology makes learning more ‘doable’.
In simple terms, datafication can be said to refer to ways of seeing, understanding and engaging with the world through digital data. This definition draws attention to how data makes things visible, knowable, and explainable, and thus amenable to some form of action or intervention. However, to be a bit more specific, there are at least ten ways of defining datafication.
- Legally & ethically
This is a good introduction to his book Big Data in Education.
You definitely do need to have two accounts, says Meika Woolard, a 13-year-old with 335,000 Instagram followers. She is one of Australia’s most prominent teen Insta-influencers, and part of a growing trend of users harnessing the power of multiple accounts.
Facebook has been designed to be an information-gathering engine in order to more effectively sell personalized advertising. Its algorithm also attempts to deeply understand your interests in order to “optimize for engagement”: keep you using the site, and therefore viewing those personalized ads, for as long as possible. Its users spend, on average, 50 minutes a day on the site.
In order to gather the most information it can, Facebook has been engineered to be the world’s most efficient peer pressure engine. Users on the platform are constantly being persuaded to stay; those who try to leave report being relentlessly emailed with personalized, emotional content to try to get them to come back.
Tantek Çelik explains this in the IndieWeb Chat:
The big reveal (IMO) of the FB/CA disclosures is that nothing you post to FB is actually “private”, in practice it is silently shared with random apps (that you happen to use your FB ID to sign into), which then are sharing it with other orgs via acquisition or just outright selling your data.
You don’t have to be “pro-tech” or “anti-tech.” Indeed, it’s hard to imagine how someone could realistically be said to be “anti-tech” – your future is going to have more technology in it, so the question isn’t, “Should we use technology?” but rather, “Which technology should we use?”
In 2018, companies from John Deere to GM to Johnson & Johnson use digital locks and abusive license agreements to force you to submit to surveillance and control how you use their products. It’s true that if you don’t pay for the product, you’re the product – but if you’re a farmer who’s just shelled out $500,000 for a new tractor, you’re still the product.
Although it can be easy to blame the forefathers, Doctorow highlights how some of these concerns are not new:
The reality is that these early “techno-utopians” were keenly aware of these risks. They founded organizations like the Electronic Frontier Foundation, and the Free Software Foundation, not because they were convinced that everything was going to be great – but because they were worried that everything could be terrible, and also because they saw the potential for things to be better.
He closes his piece explaining that we have to fight for and work with technologists for a better future.
Our technology can make our lives better, can give us more control, can give us more privacy – but only if we force it to live up to its promise. Any path to that better future will involve technologists, because no group of people on earth is better equipped to understand how important it is to get there.
For me, this is why I persist with the IndieWeb.