Ed Tech Vox Populi

Here’s what I am working on.

A serialized audio version of Martin Weller’s book 25 Years of Ed Tech, featuring volunteer contributions from 25 different people in education, edtech, and open education.

Martin doesn’t know this, but he planted the seeds of this project with me long before he published the book. Each year, Martin writes a blog post recapping the books he reads and I have always been in awe of his prodigious annual book consumption; 93+ last year! I am lucky if I get through 2.

I asked Martin how he managed to do this. His secret? Audiobooks. So on his recommendation, I decided to give the format a try and discovered three things:

  1. It makes a big difference in how quickly I can get through a book. I am still nowhere near 100 like Martin, but am seeing a big uptick in the amount of reading I am doing.
  2. I really enjoy the format. I mean, the full cast version of American Gods by Neil Gaiman? Epic. Many of you know I used to have a radio career, so I have an affinity for audio, despite getting burned out by doing it for a living. Honestly, the old “do what you love and you’ll never work a day in your life” axiom doesn’t really fly with me. But….well, another post.
  3. It has really expanded the physical spaces I read/listen in. Run on the treadmill? In goes the audiobook. Walk Tanner? In goes an audiobook. No longer is my book reading limited to the 5 minutes in bed before I fall asleep, which used to be the case.

Thanks to Martin, I’m hooked and a fan of the format now, too.

So, when his book was nearing the final stages of being (openly) published by Athabasca Press, I thought, “Well, Martin is a huge fan of audiobooks, yet his own book isn’t available as one. That’s a shame.”

Wait a sec. I have a radio background. I’ve done audio work before. I wonder how much work it would be to do an audiobook version of it? Hmmmm…..

Light. Bulb.

I emailed Martin and said, “Hey, I have an idea….” He immediately sent me an advance PDF copy of the book so I could test out how much effort might be involved in creating an audio version.

But then I thought, “You know, I am not the only person with audio experience in my network. There are a ton of wonderful educators who do podcasts. I wonder if they might want to participate?”

So I started sending out some exploratory messages to people in my network whom I thought might be interested in taking part. And before I knew it, I had 25 people lined up, each to read a chapter. Given the enthusiastic response, I am sure I could have easily found 25 more (and I am sorry to the dozens of other people on my list to contact; I ran out of chapters).

I was also trying to be mindful of ensuring that we had a diversity of voices participating. And by voices, I mean that in the literal sense of the word, as the overall narrative voice is, of course, Martin’s. It is his book and these are his words, which makes it an odd sort of thing when you are asking people to read someone else’s words in something as intimate as their own voice.

Maha Bali was the first to notice this. Shortly after she test read her chapter, she emailed to tell me that she felt the urge to comment on the contents of her chapter. However, we are adhering closely to the No Derivatives restriction on the book (more on this in a moment), so our readings need to be word for word of the original. But it was Maha who first tossed out the idea that maybe there could be some way to have a discussion about each chapter that was separate from the audio version of the book.

And into the project comes Laura

Laura Pasquini was also someone I had asked to participate, knowing that she was an active podcaster. She picked up on Maha’s comments.

“Hey Clint. What do you think about doing a podcast that is kind of like a book club where we could invite people to discuss the chapters? Oh, and by the way, I have a pro account for Transistor and I would be happy to host the podcast and can take care of setting up all the feeds and such for syndication.”

Um….yes please!

So the plan is to release the book as a serialized podcast, with one chapter released every Monday read by a different volunteer narrator, and a second podcast released on Thursday featuring a discussion of that chapter. I am gathering the book chapters and Laura is producing the supplemental podcast. Both will be pushed out on the same feed. It will soon be available via the podcast tool of your choice, but for now, if you know how to manually set up a podcast subscription you can use this RSS feed.
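If you are curious what sits behind that subscription link, a podcast feed is just a regular RSS feed whose items carry audio enclosures. A minimal sketch of what one weekly episode entry might look like (all titles and URLs here are placeholders, not the real feed):

```xml
<rss version="2.0">
  <channel>
    <title>25 Years of Ed Tech: The Serialized Audio Version</title>
    <link>https://example.com/podcast</link>
    <description>One chapter per week, read by a volunteer narrator.</description>
    <item>
      <title>Chapter 1</title>
      <link>https://example.com/podcast/chapter-01</link>
      <!-- the enclosure element is what turns an RSS item into a podcast episode -->
      <enclosure url="https://example.com/audio/chapter-01.mp3"
                 length="12345678" type="audio/mpeg"/>
    </item>
  </channel>
</rss>
```

Any podcast app that can subscribe to an RSS URL will pick up new items like this as they are published each week.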

Open Win

I alluded to the copyright license earlier, but I want to hit on it here explicitly. The open license that Martin and Athabasca Press have released his book under (CC BY-NC-ND) plays an incredibly important role in making a project like this possible, and the possibility lies in a nuanced detail of the Creative Commons license that may not be obvious when you see an -ND restriction.

At first blush, many might think that the -ND clause would prevent this type of activity. However, what we are doing in this project is something called format-shifting – moving from print to audio, one format to another. We are being very careful not to alter the actual words in the book, which would start to veer into adaptation/derivative territory.

But format-shifting is allowed even with an -ND license. All Creative Commons licenses allow format-shifting. So while the -ND clause may seem restrictive, it is still flexible enough to allow us to redo the book in audio form and redistribute it as a weekly podcast without having to ask for permission ahead of time*, which illustrates the value that ANY CC license can bring to a piece of content vs. an All Rights Reserved copyright.

I should also note that the artwork we are using by Bryan Mathers is also CC licensed, as is the background music for the podcast. Open wins all around.

The launch date is November 4th and you can see a full list of all the wonderful people who are participating on the podcast website. Martin has also written about the project.

* side note: even though CC licenses don’t require asking for permission, I do like to keep people in the loop when their stuff is involved and I have been in regular contact with Athabasca Press & Martin and they have been very supportive.


Domo Arigato EdTech Roboto

Back in the days P.C. (Pre-COVID) I was starting to do a deeper dive into the world of Mastodon and set up an instance to play around with. One of the things I did was build a bot that automatically posts blog posts from edtech bloggers to Mastodon.

I’m going to try to CogDog walk through the process as much as I can here in this post, hazy memories and all, as I did this work 6-9 months ago. But to start, in true CogDog style, the live demo or, if you prefer, the screenshot below of what the EdTech bot Mastodon account looks like, if you don’t want to click away just yet.

Screenshot of a Mastodon social media account showing the home page of the EdTech bot

What you are looking at in the screenshot is the homepage of the EdTech bot, built on a Mastodon instance hosted on Cloudron by the OpenETC. The bot is fed blog posts from an aggregated list of edtech blogs in my Inoreader account, which I have connected to Mastodon using an IFTTT recipe. The bot watches my Inoreader blog list and, when it sees a new post from one of the edtech bloggers on my list, automatically posts it to Mastodon.

Holy moving parts, Batman. I’m going to do my best to break this down.

But first, some Mastodon context

Now, I do have a Mastodon user account on the mastodon.social server where I cross-post content from that account to Twitter as a way to begin stepping away from relying so heavily on Twitter without losing my network (you can read some of my rationale in a post I wrote for OER20). Mastodon.social is the instance of Mastodon maintained by the developer of Mastodon.

But Mastodon is an open-source application, meaning I don’t have to rely on the mastodon.social instance. I can actually set up my own Mastodon server and run my own Twitter-like service. I can keep it closed to just a few people, or open it up and connect it to other Mastodon instances creating a wider social network.

I do think that federated models that can be controlled at the local level, but still connected to other instances, are a good alternative to commercial social media platforms, especially when they are hosted and controlled at the community level.

I get that people are still hesitant to take the plunge – re-creating networks is difficult. So part of my experimenting was to see if I could find ways to make Mastodon a bit more useful for others who may eventually find their way there. One value-add I wanted to explore was building a Mastodon bot that could provide a source of relevant, updated content for educational technologists on Mastodon.

The dark side of bots

Now, before I go further, I should acknowledge that there is a definite dark side to building bots for social media sites. Much misinformation and disinformation being spread on social media today comes from bots. Which, in itself, is a pretty good reason to learn how to build bots as a way to understand how they work and how they can be used to spread misinformation.

That said, I also think bots can be useful in a trusted network – in my case, as a way to provide a useful account for others interested in edtech to follow, giving them a reason to find value in a new platform where their network does not yet exist. I thought a bot that posts curated blog posts might be a way to bring an edtech network into the Mastodon world, where new people using the service only had to follow an account to connect with a relevant network via their blog posts. That was the rationale behind my experiment to build a useful bot.

You might be asking yourself, “so, aren’t you worried about others building a bot that may be malicious?” Here is the great thing about running your own software – you control the environment and who you want to let into your instance. Don’t want your instance to be used to build bots? Lock it down. Restrict registration to a trusted group of people or even just yourself. Run it like you do a WordPress blog if you wish. No one else can create an account on your Mastodon instance, but you can still interact with the rest of the Mastodon world because of the federated aspect of the application. Lock it up…

Screen shot showing nobody can sign up

And only invite those on to your instance whom you trust.

Screenshot showing admin field that allows administrators of mastodon to invite select people in

Now, this does not prevent the creation of nasty bots on other Mastodon instances. But by controlling your own environment you do limit and control bot creation on your own instance, which is a heck of a lot more bot control than you get on open public platforms like Twitter.

Ok, onto the steps.

Step 1: Mastodon

So, the first step was setting up my own instance of Mastodon, which I did via the OpenETC. Grant Potter has been experimenting with an application deployment framework called Cloudron which makes deploying web-based applications fairly straightforward.

One of the applications in Cloudron is Mastodon, so with a click and a subdomain, I was able to get a sandbox instance of Mastodon up, configured, and running on the OpenETC.

Screenshot showing the homepage of the OpenETC test mastodon instance

I spent a few weeks of spare time poking around and figuring out the administration side of the instance and invited a few people in my network to create some test accounts to give it a try. I’m glossing over a lot of the details of the set-up and configuration here, but that is a blog post on its own and I want to get to the bot building part.

Step 2: Make the bot

Once I had a local instance of Mastodon up and running, I needed to create a new account on the instance that would be my bot. Once the account was created, I logged in to the new bot account and went into the preferences screen where I customized the bot page to make it clear to people that this was a bot and what the bot did. Within the account preferences, there is a toggle switch that designates the account as a bot.

screenshot showing the bot toggle

This toggles a notification on the bot profile page that lets others know that this is an automated bot account.

screenshot of bot profile page letting others know this account is a bot

Step 3: Feed the bot with Inoreader

Around this time I was also switching my RSS reader from Feedly to Inoreader after hearing Laura Gibbs rave about it on Twitter (it’s a very good product and I am happy I listened to Laura). One of the features of Inoreader is the ability to generate a single RSS feed from a collection of RSS feeds. It’s a wonderful way to aggregate a lot of information sources and share them as a single feed. In Inoreader I set up a collection of blogs from about 20 or so edtech bloggers I read on a regular basis and called the collection EdTech Blogs. From there I am able to get a single RSS feed for all those blogs.

animated gif showing how to access a single RSS feed from a collection
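Under the hood, that combined feed is ordinary RSS: a channel containing items, each with a title, author, and link. A minimal sketch of pulling those fields out with Python's standard library (the sample feed below is an assumed, simplified stand-in for the real Inoreader output):

```python
import xml.etree.ElementTree as ET

# A tiny sample of the structure a combined RSS feed has
# (placeholder content, not the real EdTech Blogs feed).
SAMPLE_FEED = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>EdTech Blogs</title>
    <item>
      <title>An example post</title>
      <link>https://example.com/an-example-post</link>
      <author>Example Author</author>
    </item>
  </channel>
</rss>"""

def parse_items(feed_xml):
    """Return (title, author, link) tuples for each item in an RSS feed."""
    root = ET.fromstring(feed_xml)
    items = []
    for item in root.iter("item"):
        items.append((
            item.findtext("title", default=""),
            item.findtext("author", default=""),
            item.findtext("link", default=""),
        ))
    return items

print(parse_items(SAMPLE_FEED))
```

This is exactly the information the bot ends up posting: title, author, and a link back to the original post.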

Step 4: Connect the do…um, bot using an IFTTT webhook

Ok, the bot account has been created. I have a source feed. Now I just need to connect the source RSS feed to the bot account and let it do its thing.

To do this, I turned to teh interwebs and they delivered this wonderful blog post that explained, in detail, how to set up a webhook to connect the bot to Inoreader through IFTTT. At a high level (you can read the blog post for the specific details), you log into your bot account on Mastodon and create an application via the development tab.

Once you create your new application in Mastodon, it will generate an access token that you need in order for IFTTT to send the blog posts from Inoreader to the correct account on your Mastodon instance.

Once you have that token, go over to IFTTT and create a webhook applet. Enter in the Inoreader RSS feed you want to monitor, the Mastodon instance url and access token, and set up a few parameters on what you want to be included in the Mastodon post. In this case, I have the blog title, blog author, and link back to the original post.

Which results in a Mastodon post from the EdTech bot that looks like this one from JR Dingwall.

There are some things I can’t control with the bot that I wish I could, like the author names, so that people can tell at a glance who wrote the blog post. The bot can only post the info it is given, and quite a few people attribute their blog posts to a generic “admin” account, so you end up seeing posts attributed to admin. And if the person creating the blog post does not have an image associated with the post, you end up with a generic-looking page document in the image placeholder. But if they do add an image, the bot picks it up and adds it to the post, like Alan’s example here.

I also wish I had more programming chops so I could write a script that does what IFTTT does as the middleware piece and not have to rely on IFTTT. But still, a fun project that will hopefully pay off someday and make it a bit more enticing for some edtech folks to check in on their long-dormant Mastodon accounts, knowing that there is a bot in the background providing fresh, relevant content for them.
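For what it’s worth, the middleware piece is smaller than it sounds. Mastodon posts statuses via POST to /api/v1/statuses with a Bearer token from the application you created on the dev tab, so a self-hosted replacement for IFTTT is essentially "for each unseen feed item, send one of these requests." A minimal sketch using only the standard library – the instance URL, token, and item tuples are placeholders, and the requests are only constructed here, never actually sent:

```python
import urllib.parse
import urllib.request

INSTANCE = "https://mastodon.example"   # placeholder: your instance URL
ACCESS_TOKEN = "YOUR-APP-TOKEN"         # placeholder: token from the dev tab

def build_status_request(title, author, link):
    """Build (but don't send) the API call IFTTT makes for each new post."""
    status = f"{title} by {author} {link}"
    data = urllib.parse.urlencode({"status": status}).encode()
    return urllib.request.Request(
        f"{INSTANCE}/api/v1/statuses",
        data=data,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        method="POST",
    )

seen_links = set()  # remember what has already been tooted between polls

def new_requests(items):
    """Given (title, author, link) tuples, return requests for unseen links."""
    requests_to_send = []
    for title, author, link in items:
        if link not in seen_links:
            seen_links.add(link)
            requests_to_send.append(build_status_request(title, author, link))
    return requests_to_send
```

Run on a timer (cron, say, every 15 minutes) against the Inoreader feed, with `urllib.request.urlopen` on each request, this would do roughly what the IFTTT applet does.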

If you have a Mastodon account and want to follow the bot, hit it up here, at least for the near future.


Could a Canadian MOOC provider have helped higher ed this fall?

Efforts appear to be well underway at most Canadian post-secondary institutions to offer a good bulk of their courses online this fall in response to COVID-19 concerns. Many institutions in Canada signalled their intent early on in the pandemic and, from my own personal experience, I know a few who have used the early announcement to hire extra staff & bolster their online technology to accommodate the surge of online learning that will be occurring in the next few weeks.

I don’t know what the summer has been like for institutions preparing for the fall. Based on my own interactions and the kinds of conversations I am hearing & seeing in my network on topics like academic burnout, I can only imagine this summer has been incredibly difficult for those who support teaching, learning & educational technology at their institutions. I wish I could say things are about to get better, but my own experience tells me that the stretch from mid-August, when instructors begin to return from their summer hiatus (those lucky enough to have one this summer), until the end of September is usually the most insane 6-8 weeks of the year.

The Canadian post-secondary system is about to undergo a massive surge in online learning that will eclipse the spring, and I suspect that over the summer a lot of energy at individual institutions has been poured into the development of high enrollment first and second year online courses. A lot of new “Introduction to…” courses will be coming online this fall, all slight variations of one another. Which, system-wide, represents a lot of redundant work.

In the spring, sensing this massive duplication effort coming, Alex Usher floated an idea around the collaborative development of a core set of shareable high enrollment first-year courses. Alex’s idea represents a more systematic approach to developing online courses to help alleviate the development and delivery pressure many institutions will be feeling acutely over the next few weeks. It is a laudable idea. Indeed, it is one that has found some purchase within my own organization where a project is currently underway to identify OER content that could be used to develop Open Courseware (OCW) for high enrollment courses.

However, as years of experience have taught many of us who have worked on OCW-type projects, the road to OCW is not an easy one. The idea of reshareable course content has driven the OER side of open education for the better part of 20 years and, while there has been a great deal of success in open education since OERs first began, the idea of OCW has had limited results. I won’t get into the myriad technical, cultural, pedagogical, and administrative hurdles that exist (some of which were touched upon by George Veletsianos in his response to Alex’s original post), but they are not insignificant.

This makes me wonder about additional ways in which the Canadian post-secondary system could collaborate and scale up online learning without such massive duplication of effort, and I began to wonder about MOOCs and MOOC providers. Might a centralized MOOC provider in Canada have given institutions another way to prepare for the fall?

If there were a publicly-funded (important, imo) national MOOC provider in Canada, say in the form of an inter-provincial institutional consortium, could it have been in a position to help the wider post-secondary system respond more nimbly to the fall onslaught by making highly enrolled first-year courses available broadly and at scale? Instead of developing course content to be copied and moved from one institutional platform to another – which is still a significant hurdle in 2020 – why not a centralized learning platform, open to thousands of students from any institution, that could scale to handle a massive influx of online learners all looking for those same foundational courses?

Not that MOOCs are the be-all and end-all. But they have shown that they can play a role in the delivery of scalable online learning, which is a need that all Canadian post-secondary institutions have right now and might have for the foreseeable future. Say what you will about MOOCs, but they are built for scale and, this fall, online learning is going to need to scale.

All this Monday morning quarterbacking also got me wondering why a national MOOC provider has yet to emerge in Canada, as they have in the US and the UK. It seems to me that, regardless of the COVID-driven online demand, there is an appetite among Canadian institutions for a MOOC service, judging from the number of publicly funded post-secondary institutions I see partnering with for-profit and non-profit providers in the US.

There are likely a lot of reasons why one has not emerged in the publicly funded landscape, not the least of which is that post-secondary education is a provincial, not a federal, responsibility in Canada. Projects like MOOCs rely on scale to be successful, which would likely require a national effort. But perhaps the timing is right to begin asking whether there is a role in the Canadian higher education landscape for a publicly funded MOOC provider, what a Canadian MOOC provider might look like, and why nothing has emerged in the years since MOOCs moved into the mainstream.


Tweaking my work from home gear

Prior to COVID, my work schedule was 3 days at home and 2 days in the office, so when COVID hit I was in a pretty good space to transition to full-time work at home. A few years back I purchased an inexpensive convertible desk from Ikea, modified it a bit by adding a shelf to elevate a larger monitor, and invested in a decent headset for video conferences and remote presentations. It was a functional setup that served me well when I divided time between home and office, and for a time when virtual meetings and webinars were occasional, not multiple times daily, events.

But then two things happened that have made me want to up my virtual work-at-home toolset.

First, COVID hit and, like many other organizations, BCcampus went fully virtual, as did every institution in the province with which I work closely. Second, I picked up another sessional online teaching position at the University of Victoria to go along with the sessional online teaching work I do at Royal Roads University. So not only has the amount of time I spend in BigBlueCollaborateZoomTeams meetings increased, but I am also spending more time creating videos to help add a sense of presence to the online courses I teach. The more time I spent doing both of these things, the more I began to notice a few things about my setup that could use some tweaking.

For one, my audio. I have a background in radio production and broadcasting, and as much as the headset I had was a step up from earbuds (I honestly don’t know how you Apple folx can spend hours with those things in your ears day in and day out), the audio quality was starting to bother me in the videos I produced for learners. And after making a return appearance on ds106 radio a few weeks back with Maren, Anne-Marie and Tannis, I have it in the back of my mind to pick up some more DJ shifts on the freeform station. So, I invested in a decent microphone, stand, shock cage, and a good old-fashioned set of comfortable over-the-ear headphones.

Microphone and headphones with computer

My audio setup is now:

  • Audio-Technica ATR2100x-USB Cardioid Dynamic Microphone ($150) I went with this one as it seemed to balance affordability with quality. I wanted a cardioid mic for the directional pickup pattern, as the room I am in is fairly noisy and I hoped a cardioid mic might help reduce the amount of background noise. It is very directional, meaning I need to have it fairly close to my mouth for it to work well, so it does now make appearances in my videos. This mic is both USB-C and XLR ready, so if I want to plug into production-quality gear I have a universal XLR plug that can get me hooked up. It is also portable, so I can take it on the road with me if I want. And I can plug headphones into the mic, which gives me a much more immediate and real sense of what the mic is “hearing” at the time of recording, meaning it is easier to catch distracting background noises. If I had had this a few weeks back when I guested on ds106, for example, I would have immediately noticed that the wind in the background was being picked up by the mic I was using. Instead, we went the whole show without me catching that basic audio gaffe.
  • Mic stand ($30) with shock mount cage ($10) & sock ($5 for a 4 pack) The mic stand attaches to the side of my desk, so without the cage to absorb vibrations, every touch of the desk would be picked up by the mic. The sock filters out things like popping P’s and wind noise. Cheap, but it does the job for the time being. That said, I can see where repeatedly moving it in and out of position on my desk is going to cause it to lose its tensile strength pretty quickly.
  • Sennheiser HD201 audio headphones ($100) I went with these because they are lightweight, comfortable, and sound very good. They are not noise cancelling, but they isolate enough of the audio around me that I can hear what the mic picks up. And they are comfortable to wear for hours at a time.

On the video side, my office is a very bright room with large windows on three sides. Working in tons of natural light is a joy, and the large window in front of me really helps with front-facing light (an important part of good video is good front lighting). But a large window directly behind you makes lighting for video a real challenge: the backlight from the window behind me sometimes caused me to appear dark on-screen, or was overly intense in the background. And the blinds we have in this room are butt ugly and battered, so when they are closed I am conscious of every bent and battered blind in camera view behind me.

Man pointing at sunny window behind him

Oh yes, the view behind me. Something I really never cared much about in the past, but perhaps I should now, as people seem to be paying attention to that. Well, right behind me is my wife’s office workspace, and she is not keen on making on-camera appearances while I am in meetings or creating videos. Because I am on camera so much, she was avoiding using her own workspace. Also right behind me: our treadmill which, like the blinds, was making me increasingly self-conscious about having it over my shoulder in the shot.

Man pointing at treadmill behind him over his right shoulder

Now, tools like Zoom and Teams have virtual backgrounds and I tried those, but I was never happy with the results, especially when used with my new mic, which takes up a bit of screen space. I ended up with odd effects using the background option.

Zoom call where fingers are mysteriously missing from participant’s hand

Zoom caller with hunk missing from shoulder

Zoom caller missing arm caused by bad Zoom virtual background

The virtual backgrounds weren’t really cutting it. So I thought about hanging some kind of curtain from the ceiling to help with the issues. But right above my desk is a ceiling fan, so hanging things was not really an option. Besides, I’ve moved my office in the house a few times and may want to do it again, so the idea of having a portable background was appealing.

Then came a tweet from Doug Belshaw a few weeks ago about a portable green screen he had purchased from Elgato. I ordered one from Best Buy ($250) and it finally arrived yesterday, and it works like a charm. Here it is sitting just behind my chair, hiding the backlight and treadmill. I have set it up quite close to the chair for the photo, but it would normally sit back a foot or two to give me some space.

Desk with green screen behind it

And a shot from behind where you can see the skeleton of the setup.


Photo of the back of a green screen

And here is what it hides behind me.

Photo of treadmill and desk

The green screen feels like a solid, well-built unit. The fabric is thick, the hardware solid, and setup is a snap.

Open the storage box and lift. Push it back down into the box when done and prop it in the corner. Setup and takedown take 30 seconds. And, being portable, I can take it to any location.

Tall carrying case for green screen

Packed up and stored tucked away in a corner of the office.

Today I used it for the first time in a meeting and was super happy with the results. No light bleeding through causing a weird halo around my head, and sharp, crisp lines. And no treadmill in the background.

Man in a Zoom

All in all, an investment of about $500 which, while not insignificant, does feel like an investment in making my home a more comfortable work environment, while increasing the audio and video quality of my presence in both synchronous meetings and facilitated learning experiences, and in the media artifacts I create for my teaching & learning practice.

Up next – a webcam that I can mount at eye level.


Using Mattermost as class hub

Over at the OpenETC site, Tannis has posted on some interesting ways that the open tools of the OpenETC are being used to support teaching & learning. It reminded me that I am looooong overdue to post about my own use of Mattermost last fall with students in my LRNT 528 course, Facilitating in Digital Learning Environments.

As I mentioned in a blog post last summer, I wanted to try an IM-like chat tool for a number of reasons. First, after using these types of IM tools myself for years, the conversations seem more free-flowing than those in a standard Moodle discussion forum. Conversations in chat tools feel more like conversations: a bit more spontaneous and natural, and I wanted to see if a change in technology could bring that same natural energy to class discussions. Second, chat tools blur the lines between synchronous and asynchronous communication, making it easier to have a spontaneous chat session while still giving students who prefer the time and space afforded by asynchronous communication the opportunity to respond on their own time. Third, chat tools have better support than most LMS discussion forums for more diverse methods of communication; GIFs and emoticons are easy reaction tools that can help people create social presence within a learning environment. Finally, IM tools are increasingly common ways of collaborating and communicating on the web, and for this particular group of learners studying digital facilitation in online learning environments, it felt like an important tool for them to use at some point in their academic career, as I can see these types of platforms becoming increasingly important in digital facilitation.

Slack use in class seems to be increasingly common, but after their “mistake” last year, where a number of user accounts (including academics & students located here in British Columbia) were deactivated for seemingly political reasons, I was not in a hurry to outsource my facilitation to them, especially when there was a viable open source alternative, Mattermost, being hosted here in BC by the OpenETC. This was my overarching reason to use Mattermost over Slack. So I created a team in Mattermost for my LRNT 528 Digital Facilitation class in the hopes of doing a direct replacement of the Moodle discussion forums with Mattermost channels.

My cohort was small (14 students), and the facilitation class is part of a larger Master’s program in Learning & Technology. Many of the students have years of experience in teaching & learning roles, so I would classify them as tech-savvy educators, which is one of the reasons I felt ok experimenting with a new technology.

That said, I wanted to be explicit with the students that Mattermost was an experiment, and I provided some extra support to walk them through the account creation process, including a Q&A session about Mattermost during the course’s introductory synchronous session. Along the way I was able to contribute back to the OpenETC some how-to documentation that I put together for my students, which others can use in the future if they would like to do something similar – living out the “contributions, not contracts” piece of our OpenETC philosophy.

In my week 1 course activities, I asked students to create their Mattermost accounts, update their user profiles to add a photo or avatar, and post an introductory message tagging me to let me know they were in. These were fairly low-stakes activities that would help get them comfortable in the environment. As they entered the space and posted their welcome messages, I made a point of greeting them personally. I also pinned some general guidelines to the top of the Town Square channel that spelled out expectations for the space.

Welcome to the LRNT 528 Mattermost chat group. This will form the discussion hub of the course. There are channels set up for each of the scheduled discussions we will have in the course (CoI, TEK-VARIETY, and Final Reflection), plus this main channel called Town Square where you can post general questions.

Some guidelines for posts.

  1. Keep your discussion posts to a single point. This will help keep your posts short.
  2. If you find your posts are getting long (over 150 words), then you likely have a lot to say about the topic we’re discussing. In that case, consider writing a blog post on your blog and then paste the link to your blog post here.
  3. Feel free to use memes, GIFs, and emoticons. These are legitimate forms of communication. That said, don’t use them needlessly, or as a replacement for genuine discussion.
  4. Don’t feel the need to academically cite content in your posts, but do include links to external content that is relevant to the discussion.

Above all, read carefully, reflect before sharing, challenge tactfully, question thoughtfully, forgive mistakes (yours and theirs), and have fun learning.

I structured the Mattermost discussion area to include the general Town Square channel, two channels, one for each of the facilitated discussions I wanted to have in the course, and a Final Reflection channel where we could debrief the course at the end. I also created a Sandbox channel where students could post and experiment, but it turned out not to be needed, as much of the experimentation with the platform happened in the general Town Square channel.
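For anyone wanting to script this setup rather than click through the admin UI, the channel layout above can be recreated with Mattermost’s REST API (`POST /api/v4/channels`). A minimal sketch, assuming a placeholder team ID; the Town Square channel exists by default, so only the course-specific channels are created:

```python
# Sketch: building the course channel layout for the Mattermost REST API.
# TEAM_ID_PLACEHOLDER is a stand-in, not a real team ID.
import re

def channel_payload(team_id: str, display_name: str) -> dict:
    """Build the JSON body for POST /api/v4/channels ("O" = public channel)."""
    # Mattermost URL names must be lowercase; use hyphens in place of spaces.
    name = re.sub(r"[^a-z0-9-]", "", display_name.lower().replace(" ", "-"))
    return {
        "team_id": team_id,
        "name": name,
        "display_name": display_name,
        "type": "O",
    }

COURSE_CHANNELS = ["CoI", "TEK-VARIETY", "Final Reflection", "Sandbox"]

payloads = [channel_payload("TEAM_ID_PLACEHOLDER", c) for c in COURSE_CHANNELS]
```

Each payload would then be sent to the server with an authenticated request, e.g. `requests.post(f"{server}/api/v4/channels", json=payload, headers={"Authorization": f"Bearer {token}"})`, where the server URL and token are your own.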

What worked well (my perspective)

Overall, the technology worked well. The conversations did seem more spontaneous, yet still well thought out. I did see an increase in the use of emoticons and GIFs, and quite often could see conversations unfolding in real time between students who happened to be on the platform at the same time discussing course content.

Being able to have students tag others in a conversation is also a nice feature common in many IM platforms but absent from the Moodle forums. You can @name someone to draw attention to a post or to bring them directly into a discussion, rather than simply name-dropping them in a discussion forum post. I am also beginning to think of the @name feature as a form of attribution, as I saw it used to tag learners who had made a good point that someone else wanted to build on. And by using @all I was able to message all learners and draw attention to something: a salient point made by a learner, or some further clarification.

Students also used the direct messaging feature of Mattermost to communicate with me, which I appreciated, as all course-related conversations were now centralized within Mattermost and not in my email account.

This was a digital facilitation course, and students do an experiential learning assignment where they become the facilitators, designing a week of facilitation for the other learners in the course. I was pleasantly surprised at how many of them decided to use Mattermost for their own facilitation weeks, which said to me that they were feeling comfortable enough in the space to use it on their own.

What didn’t work well (my perspective)

On the downside, threaded conversations are not quite as straightforward as in a discussion forum, and it could sometimes feel overwhelming to figure out how to view and respond to specific threaded discussions. But once learners figured out how to use the threading feature, it seemed to lessen the cognitive load of being presented with a wall of seemingly unstructured conversation when entering a channel.

What the students thought

I ran a short informal survey with the students at the end of the term to get their feedback on using Mattermost as compared to Moodle discussion forums, and overall they were quite happy with the tool, comparing it favorably to Slack in terms of functionality. But more importantly, there was overwhelming consensus among the learners that Mattermost did change the way they participated in the class, and that the technology made them feel more engaged with both the course material and their classmates.

I can’t release the details of the feedback, as it was an informal survey of students that was meant just for my own information. But the results were promising enough, and gave me enough information to suggest that there is something about the way tools like Slack and Mattermost work that changes the way students participate and engage. I am planning on using Mattermost again this fall with a larger cohort and am going to pitch doing a SoTL-focused piece of research on using it, this time with ethics approval so I can publish some findings next year.