Just do ink

App icon from the app store

I took on the challenge of finding a new tool to work with this week and chose the DoInk app. Although I knew green screen apps existed, I didn't know the name of any specific one. Thanks to Rochelle, I was introduced to DoInk. It is available in the App Store for $3.99, which I feel is a fair price for what you get. I downloaded the app on my iPhone 6, though I suspect it would have been nicer to work with on an iPad simply because the screen is a little bigger. The DoInk website has a lot of tutorials and tips to help you get started.

Our green screen made from wrapping paper

I decided to jump right in before reading much about the app or viewing the tutorials on the site. Before I could do anything I needed to create a green screen. There are many ways to make one, and most are pretty affordable. I purchased a plastic tablecloth from the dollar store, but it didn't work out the way I had hoped. The green wasn't dark enough, so our grey/blue wall color showed through and the app kept picking that up instead. Whenever it picked up the paint color, the image I had selected as the green screen background came out very light and almost fuzzy. I doubled up the tablecloth, but it still didn't do the trick. On a second trip to the dollar store I bought some bright green wrapping paper, and that worked wonderfully. My only complaint is the faint green outline that appears around the objects or person in front of the screen, but I suppose that's what you get for a $3.99 app.
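Out of curiosity, here is a rough sketch of what I understand a green screen app to be doing behind the scenes. To be clear, this is not DoInk's actual code; it is a minimal chroma-key example in Python using OpenCV, and the green threshold values are ones I picked purely for illustration. It also hints at why my dull tablecloth failed (fewer pixels land inside the "green" range) and where that faint outline comes from (the soft edge of the mask).

```python
import cv2
import numpy as np

def chroma_key(frame_bgr, background_bgr):
    """Replace green-ish pixels in frame_bgr with the matching pixels
    from background_bgr (both images must be the same size)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

    # Pixels whose hue falls in the green range and are saturated and bright
    # enough count as "screen". A washed-out tablecloth lets the wall colour
    # show through, so fewer pixels land inside this range. (Threshold values
    # here are illustrative guesses, not DoInk's settings.)
    lower_green = np.array([40, 80, 80])
    upper_green = np.array([80, 255, 255])
    mask = cv2.inRange(hsv, lower_green, upper_green)

    # Soften the edge of the mask slightly; the hard transition at this edge
    # is what produces the faint green outline around people and objects.
    mask = cv2.GaussianBlur(mask, (5, 5), 0)
    alpha = mask.astype(np.float32)[..., None] / 255.0

    # Composite: background where the mask says "green", original frame elsewhere.
    out = frame_bgr * (1 - alpha) + background_bgr * alpha
    return out.astype(np.uint8)
```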

In terms of using the app, I would say it is pretty user-friendly. I was able to figure it out without watching the tutorials or reading the user guide the app walks you through on first launch. I probably could have saved myself some time had I actually read the guide or watched the tutorials, but I'm all about experiential learning. After spending some time exploring the DoInk site I noticed all of the tutorials they offer, as well as some great tips for making green screen videos.

To create the video I found an image through a Creative Commons search and took a screenshot of it so it was on my phone. I edited the photo so that it filled the screen and uploaded it to DoInk to use as the background. My son used some of his Star Wars toys to play out a mini movie scene. I then imported the video into iMovie and added the audio as well as the rolling credits. You can add text and draw on the video within DoInk, but I wanted to add the audio and credits in iMovie.

From a teaching perspective, I don't know that I would use this app a whole lot. It would be possible to create some fun video lessons, but I don't see it as being very practical because the videos take time to make and I'm not sure how good they would be at getting content or skills across to students. I see this being used for student projects. I think it would be a really fun way for students to present information: maybe create a newscast, or take a trip around the world describing the different images being shown while dressed in character. Having students create videos of their own falls under the constructivist and connectivist learning theories, according to Bates. Bates also provides some criteria to consider when selecting videos to use:

  • it is short and to the point;
  • it is relevant to what you want to teach;
  • it demonstrates clearly a particular topic or subject and links it to what the student is intended to learn;
  • the example is well produced (clear camera work, good presenter, clear audio);
  • it provides something that you could not do easily yourself;
  • it is freely available for non-commercial use.

If you are making your own video lesson, you would want to keep these tips in mind. Keeping it short and to the point is sometimes impossible, depending on the skill or topic you are trying to teach. If you can't keep it short, it might be helpful to break up the video with some humor or integrated clips and images.

Have you ever created a video lesson? How did it go? What did your students think about it? And how did you create it? I’d love to hear from you.

I’m not an artist, but I could sure use this Canvas

After spending some time exploring different LMSs this week, our group has decided to go with Canvas. Both Nancy and I have used Edmodo extensively, and after exploring Google Classroom last week we realized it's very similar, so we didn't want to go with that option. Andrew suggested we look at Canvas a little further, and after some exploration we decided to go that route. If you have never used Canvas, Andrew created a video demonstrating how to navigate the dashboard in order to set up your class. I've included that video below, and you can read more about his thoughts on Canvas in his latest post.

When I started exploring Canvas, I found it pretty user-friendly. I appreciated the classroom set-up checklist that is included when you start a class. It takes you through setting up a class step by step, and I found it really helpful and easy to follow. However, there were a lot of features that weren't covered in the checklist, so I missed them the first time I explored Canvas. After reading Kyle's blog, I learned that outcomes can be attached to the assignments or lessons you add to a class; I didn't know that was an option before reading his post. That prompted me to look into the outcomes feature to see how it works. Unfortunately, the outcomes that come preloaded are American, so I would have to enter my own (which isn't a big surprise, but it would be nice to have our outcomes already there to select from).

This brief product video also taught me a few things, one of which is the ability to connect apps with the classroom you have created. I use Khan Academy to teach coding in some of my technology courses, so being able to connect that content to this platform is great. It eliminates the need for students to visit multiple sites to take part in the class, which makes things a little more user-friendly for them (and for me).
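On a slightly more technical note, Canvas also exposes a public REST API, which means parts of a course can be scripted rather than clicked together. Here is a minimal sketch in Python that lists the courses visible to you; the domain and access token are placeholders, and I am going off Canvas's documented /api/v1/courses endpoint rather than anything I have tested against our own instance.

```python
import requests

# Placeholders: your institution's Canvas domain and a personal access token
# (generated under Account > Settings in Canvas).
CANVAS_BASE = "https://yourschool.instructure.com"
TOKEN = "YOUR_ACCESS_TOKEN"

def list_courses():
    """Fetch the courses visible to the token's owner via the Canvas REST API."""
    resp = requests.get(
        f"{CANVAS_BASE}/api/v1/courses",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return [course["name"] for course in resp.json()]

if __name__ == "__main__":
    for name in list_courses():
        print(name)
```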

Canvas has a Commons area where you can share courses as well as use courses developed by other people. It is basically a digital library created by Canvas users. The courses seem to be built around standards and themes from the United States, but that is probably because Americans make up the majority of people sharing their work. It would be nice to see more courses added by people in Canada, and more specifically Saskatchewan; hopefully after this class we will have a few courses that can be shared on Canvas. Another thing I noticed is that a lot of the courses in Commons are only partially finished, with just a few assignments or modules. The Commons area provides access to courses, modules, assignments, documents, quizzes and a variety of other resources.

Screenshot of the Commons Area

I should also mention that I was shocked (in a good way) to receive a phone call at work on Friday from Matt at Canvas. He was simply calling to check in on my initial experience and to answer any questions I might have. He was able to answer the one question I did have at the time: whether the student and teacher dashboards look the same. He told me they look almost identical, minus some menus the teacher uses to edit the course that students don't see. They set it up this way so there is little confusion going from one view to the other, and it makes it easier for teachers to help students who need a hand navigating the course. I really like that it looks the same for teachers as it does for students. I was impressed that they took time out of their day to call and make sure everything was going well so far, and I feel confident that if I have any questions, help is only a call or a click away.

After reading Kyle, Logan and Liz’s blogs this week it is clear that I still have a lot to learn with Canvas. I’m looking forward to using this with my group to develop our course. I think it will be a great LMS for our project.

Developing an Online Digital Citizenship Course – The Beginning

Photo Credit: drpretty Flickr via Compfight cc

I have to start by saying that I am pretty excited about the major project for this semester. We have been asked to create an online course consisting of different lessons, activities and assessments. This is something that has appealed to me since the start of my master’s program and I am hoping that I have some opportunities in the future to be a part of developing online content for our schools.

I am fortunate enough to work with two amazing teachers (Andrew and Nancy) who are joining me in creating our first online course. We have decided to work with the Digital Citizenship Continuum from within the Digital Citizenship Education in Saskatchewan Schools document, developed by our very own "Courobrandt" duo of Alec Couros and Katia Hildebrandt. The continuum focuses on digital citizenship and lays out competencies from Kindergarten right through to Grade 12. The guide was developed to help K-12 teachers integrate digital citizenship instruction into the classroom. I highly recommend reading through the document; if you don't want to read all of it, at least check out the competencies starting on page 56. If you are not familiar with digital citizenship, check out this brief video.

The competencies draw on Ribble's nine elements of digital citizenship and fall under three broad categories:

  1. Respect – digital etiquette, digital access and digital law
  2. Educate – digital communication, digital literacy, digital commerce
  3. Protect – digital rights and responsibilities, digital safety and security, digital health and wellness

There are nine competencies, and we will fully develop lessons, activities and assessments for three of them at the Grades 9-12 level. Each group member will choose one competency to develop based on personal interest.

Photo Credit: hitchinssamson Flickr via Compfight cc

The competency that caught my attention right away was Digital Health and Wellness: the physical and psychological well-being related to digital technology use. It focuses on developing an understanding that using technology inappropriately can hurt us both physically and emotionally: physically through something like texting and driving, and emotionally through overuse of and addiction to technology.

The course we will develop is cross-curricular and can fit into many other courses such as ELA, Information Processing, Psychology, Social Studies and Health. Given the nature of the content, it will be very relevant to all students because technology plays such a large role in their daily lives.

We haven't thought a lot about how we will assess or exactly which tools we will use, but we have discussed using blogs, a wikispace (or other website), assessments built with Socrative or Google Forms, Google Docs, and presentation tools such as Powtoon and screencasts.

I know I haven't given you much information yet, but what are your initial thoughts about this course? Do you have any suggestions for me or my group? At this point we don't have a super clear vision of what it will look like, but I feel like we have a pretty good start. Share your thoughts in the comments below.

Photo Credit: Leo Reynolds via Compfight cc

 

The end is in sight…

Well the time has finally come…I’m taking my last class this semester and I’m looking forward to completing my degree. I’m happy to be taking another class with Alec & Katia and the class members who make the learning experience so valuable. It’s such an awesome community and I can’t wait to get this semester going.

I am a high school teacher at Regina Huda School specializing in business, technology and math. I have been teaching there since 2010 and have no plans to move anytime soon. I am a wife and a mom to two crazy but amazing kids: a 3.5-year-old boy and an 18-month-old girl. They keep me busy, but I know they will be grown up and moved out before I know it, so I try to soak up all the time I can with them and just enjoy the moment.

My family June 2016

When I do have spare time (as if I have any right now), I enjoy keeping active by playing soccer, hockey and golf. I also like to run, but I don't enjoy running indoors, so it seems to be a seasonal thing for me. I also love all things Disney, and the Disney parks are one of our favourite places to travel to.

My learning goals for this semester are to get a feel for developing an online course, since this is something I would be interested in doing for the ministry in the future. I am also hoping to do a better job of connecting with my peers through Twitter and blogging by checking in a little more often than I have in past semesters (much easier said than done with a hectic schedule). Finally, I hope to learn about some new tools for creating online courses so that I can make use of them this semester and beyond.

12 Weeks of EdTech – A Summary of Learning

Given that it's the holiday season, I thought I would have some fun and attempt a cover of a Christmas song for my summary of learning. I have done three summaries of learning before, so I wanted to do something different, and since I hadn't yet attempted a song I thought: why not this semester? I have to apologize, as singing is not something that comes naturally to me, nor is it something I do well. The background music didn't turn out the way I wanted either; it is quite low and sounds a little echoey, but I honestly tried recording it three different ways MULTIPLE times and this is the best quality I could come up with. If it's too painful to watch, feel free to skip to the last 15 seconds where the 12 weeks count down. Please also keep in mind that what EdTech taught me each week is not presented in chronological order, for obvious reasons; it was too difficult to make it all go in order and still make sense. Regardless of the order, I hope you enjoy my little song (singing aside).

In case you missed the lyrics for each week, here they are:

Twelve weeks with you guys
Eleven ways to connect
Ten awesome blog posts
Nine classroom tools
Eight tools to assess
Seven grand presentations
Six assistive tech tools
Five classes with Alec
Four learning theories
Three types of web
Two different realities
And a collaborative experience online

Obviously I learned a lot more than the list of items in my song, so I want to discuss a bit more of what I learned this semester; the song just doesn't do it justice. We covered a lot of topics and had some awesome presentations. There was some overlap between the topics, which made everything seem a little less overwhelming and made it easy to see how many EdTech topics relate to one another. Here is a summary of a few main ideas from this semester.

Learning Theories
Technology allows us to draw on four different learning theories: behaviourism, cognitivism, constructivism and connectivism. Although each theory can be used, most technology lends itself best to constructivism and connectivism. Websites and apps used for things like WebQuests or Genius Hour lend themselves nicely to the constructivist approach, in which students build on prior knowledge and make connections between what they are learning and the real world.

Blogging and Skype are excellent ways to connect your students with others outside of the classroom and to learn through the connectivist approach. Whatever learning theory is being applied, we must always think of our students. Behaviourism and cognitivism are more teacher-directed, one-way forms of learning, while connectivism and constructivism allow students to build knowledge and direct their own learning. When choosing which technology to use, be sure to think about the learning theory involved and how it will impact student learning.

Tool Selection
The tools we use greatly impact how students learn, how we teach, what we teach and how we assess. Before we decide which tool to use, we must always think about the message being sent through the medium. Which types of learners are benefiting from the tools we are using? Which types are falling behind? We also need to consider the purpose of the tool. Are we using each tool for its intended purpose? Are we going beyond simple cognitivist or behaviourist learning methods?

Technology also gives us many opportunities to assess our students' learning, but how can we ensure that our assessments are valid? Many assessment tools offer multiple-choice or true/false questions. The issue with these question types is that they are usually surface level and don't probe for deeper understanding. Students are also able to guess on some of these questions, and does guessing really show us what students have learned? It is crucial that we evaluate the tools we select and consider the message being sent by using them. We must always be questioning and evaluating the purpose of the tool. This is a great article to read if you need guidance on integrating technology effectively.
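To put a rough number on the guessing concern, here is a quick simulation (the quiz length and number of options are just illustrative): a student who guesses randomly on a four-option multiple-choice quiz still averages around 25%, which tells us very little about what they actually learned.

```python
import random

def expected_guess_score(num_questions=20, options=4, trials=10_000):
    """Average percentage score of a student who guesses randomly on every
    question of a multiple-choice quiz, estimated over many simulated attempts."""
    total = 0.0
    for _ in range(trials):
        correct = sum(1 for _ in range(num_questions)
                      if random.randrange(options) == 0)  # option 0 = right answer
        total += correct / num_questions
    return 100 * total / trials

# A pure guesser on a 20-question, four-option quiz averages about 25%.
print(f"{expected_guess_score():.1f}%")
```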

Both teachers and students (but especially teachers) have to know how to seamlessly integrate technology into teaching and learning

Advantaged Vs Disadvantaged Students
In all of our presentations we discussed who is advantaged and who is disadvantaged when we use technology. This is an interesting concept to think about because it boils down to the perspective you look at it from. If we start with socio-economic status (SES), it is clear that a divide exists between those who can afford technology and those who cannot. We need to work at bridging this gap and give those who are disadvantaged the same opportunities within our classrooms. Perhaps students who do not have devices to work on at home should get priority over those who do when using technology in the classroom. Whatever the scenario, it is important to try to level the playing field when it comes to access to technology.

Another perspective we need to think about is students who are at a disadvantage because of a disability, whether physical, emotional or mental. For some of these students, assistive technology can greatly impact their learning and make things more equitable. We must ensure that other students and parents do not think the student using assistive technology is being given the upper hand; the reality is that if they didn't need the tool, they wouldn't use it. There are stereotypes and labels associated with students who use assistive technology, and oftentimes these students feel singled out and "different" because they need the additional support. We need to work towards eliminating those stereotypes and labels.


Is this the real life? Is this just fantasy? Actually, it’s Virtual Reality.

The first few lyrics of Queen's Bohemian Rhapsody are all I can think of when I think about virtual reality (VR). If you aren't already familiar with it, virtual reality is, simply put, a technology that allows you to experience another environment through sight. A headset tracks your head and eye movements and changes the image you see inside it, which changes the environment you are experiencing; the image and its movement trick our brains into making the experience feel lifelike. Why might someone use virtual reality? There are a variety of reasons that go beyond simply entertaining ourselves: there are nine different industries that use VR for training, education or experiences. Sharon discusses some VR tools that Sask Polytechnic uses here in Regina to train its nurses, and VR is also being used to help treat patients with dementia and to teach people how to walk again. For an overview of virtual reality and how it works, check out this video.
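For anyone curious about the math behind that head tracking, here is a tiny illustrative sketch (not taken from any real headset's code) of how the yaw and pitch angles reported by a tracker can be turned into the direction the scene should be rendered along.

```python
import numpy as np

def view_direction(yaw_deg, pitch_deg):
    """Convert head-tracking angles (yaw = turning left/right, pitch = looking
    up/down, both in degrees) into the unit vector the scene is rendered along."""
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    return np.array([
        np.cos(pitch) * np.sin(yaw),  # x: left/right
        np.sin(pitch),                # y: up/down
        np.cos(pitch) * np.cos(yaw),  # z: straight ahead
    ])

print(view_direction(0, 0))   # looking straight ahead: roughly [0, 0, 1]
print(view_direction(90, 0))  # head turned 90 degrees right: roughly [1, 0, 0]
```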

Amy found a really great TED Talk discussing how virtual reality could be used to develop empathy by letting us experience the lives of others around the world. I cannot even fathom what it would be like to walk a mile in someone else's shoes in a war-torn country, or in a country where children must walk miles to get to school. Yes, I have seen videos and documentaries, but they do not give me the experience that VR could. I had never thought of using VR in this way before, and I think it is an incredible way to use the technology.

Photo Credit: bmward_2000 Flickr via Compfight cc

Augmented reality (AR) is another type of reality that can be experienced using technology. With AR, we experience the real world with information overlaid on top of it. Some forms of AR I am familiar with come from watching TSN or other sports on TV: the first down line on a football field is a form of augmented reality, and it can be argued that slow motion is as well, because it helps us examine a clip more closely to see what happened. Charles Arthur provides a thorough description of AR, discussing its development, AR apps and the future of advertising using AR. Bill and Logan introduced us to Aurasma, an AR app with many uses in the classroom, and Rochelle described how she uses Aurasma at her school by having students create book reviews for the books in the library. A book review is just one example of the many ways AR can be used in education and in our classrooms.
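At its simplest, that first down line is just a graphic drawn over every frame of video. Here is a toy sketch with OpenCV that marks a vertical line on one frame; the clip name and pixel column are placeholders, and a real broadcast system also tracks the camera and keeps players from being painted over, which this ignores entirely.

```python
import cv2

def overlay_first_down_line(frame, x_position, colour=(0, 255, 255)):
    """Draw a vertical 'first down' style line over a video frame, the way a
    broadcast graphics system overlays information on live footage."""
    height = frame.shape[0]
    cv2.line(frame, (x_position, 0), (x_position, height), colour, thickness=4)
    return frame

# Usage sketch: grab one frame from a clip and mark the line at pixel column 640.
cap = cv2.VideoCapture("football_clip.mp4")  # placeholder file name
ok, frame = cap.read()
if ok:
    cv2.imwrite("frame_with_line.png", overlay_first_down_line(frame, 640))
cap.release()
```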

Of course, we can't forget about the digital divide when we think about integrating these experiences into our classrooms. We must always remember that students come from different socio-economic backgrounds and that their access to technology may vary. The cost of implementing VR technology in our classes can also be very high (unless we use Google Cardboard, which is reasonably priced).

I can definitely see myself using a word wall in my math courses and integrating some of the virtual experiences into my technology class. I am really interested in Google Expeditions and want to find a way to bring it into my technology class; this might be something I could collaborate on with another teacher to make it a cross-curricular activity mixing technology with social studies or science. I was happy to hear that so many of you already have experience with these different realities, and I love hearing how you integrate them into your classes. If you are doing anything that uses these technologies, I'd love to hear about it in the comments below!

Assistive Technology Doesn’t Just Involve Technology

Photo Credit: woodleywonderworks Flickr via Compfight cc

I was a little apprehensive about writing this post on the topic of assistive technology. I wasn't sure I would have much to say, because I didn't think I had a lot of experience with assistive technology, but after reading a few of my classmates' blogs this week I was able to think about it from a new perspective. I teach at the same school as Andrew, so my experience is much the same in that I don't have the variety of students many other teachers have. I have had very few students with disabilities who need adaptations; however, there have been instances in which I have had to make them. In my internship I had a student who was unable to read anything printed on white paper, so I had to print everything for them on yellow or green paper. Another way I have accommodated a student with a disability is by chunking their work: breaking a big assignment down into manageable pieces so they don't get overwhelmed and fail to finish it.

I didn't think any of these adaptations could fall under assistive technology until I read Amy's and Heidi's blogs this week. Each discusses ways we adapt that might not involve technology at all. If you check out the Understood website, there is a long list of assistive technologies that don't actually involve digital technology. After reading through some of the items on the list, I realized that I do a lot more adapting than I had originally thought. In my math classes, students use calculators, graph paper, rulers, protractors and manipulatives; these are all assistive technologies. Other examples include chair cushions, fidgets, spell-check, timers and graphic organizers.

Dave Edyburn describes assistive technology as "any item, piece of equipment, or product system, whether acquired commercially off the shelf, modified or customized, that is used to increase, maintain, or improve the functional capabilities of a child with a disability". I feel like it's a pretty good definition of assistive technology, but I do think assistive technology can help everyone, not just those with disabilities.

Assistive technologies (or ATs) are specialized technology (software and/or hardware) that are used by people with and without disabilities to adapt how specific tasks can be performed.

Photo Credit: DiegoMolano Flickr via Compfight cc

I think assistive technologies go beyond hardware and software and include any object or device that allows us to be more efficient or productive. We all use assistive technology every day: computers, phones, word processors, Siri, microwaves and cars are just some of the items that assist us daily. Obviously there are some devices (hearing aids, braille, sensory objects, to name a few) that are more helpful to those who have disabilities and that impact their daily lives far more than mine. For example, could I get by without a computer? Sure I could, but my work life would be a lot less productive. I appreciate having the technology, but if the computer had never been invented I wouldn't know any different and would still be able to carry out my job no problem. However, someone who is blind and never learns to read braille will have significant difficulty reading and learning.

Google Read and Write was discussed a lot this past week, and it was interesting to read teachers' accounts of using it in their classrooms. Roxanne is able to integrate it into her daily language lessons, and I think it is a great adaptation for those who struggle, but also a great tool for students who may not necessarily need it. It has a variety of features, and two that I thought were really great were the vocabulary list and the word predictor. The word predictor is especially useful for students who may be learning English or who struggle with reading.

I haven't had any experience with the add-on, but after watching this video I have a few suggestions. The first is that it would be nice if the picture dictionary offered real, lifelike pictures to choose from as opposed to simple cartoons and clip art. My second suggestion isn't just for Google Read and Write but for all text-to-speech (TTS) software: it would be nice if the audio didn't sound so robotic. Is it too much to ask to have it sound more like an audiobook read by a real person? I know it isn't easy to develop software that can do that, but my hope is that sometime in the future we get there. I can't imagine having to use TTS often and having to listen to Mr. Roboto talk to me. If you don't know what I'm talking about, here is a sample from the article we were asked to read this week. It had a listen option, so I clicked it to see how it sounds. Let's just say I didn't listen to the whole file and can't imagine having no option but to listen to it.
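If you want to hear that robotic voice for yourself, the snippet below uses pyttsx3, an offline Python library that reads text aloud with your operating system's built-in voices. It has nothing to do with Google Read and Write's engine, but it gives a good sense of why classic TTS sounds the way it does.

```python
import pyttsx3

# pyttsx3 wraps the operating system's built-in voices, which is exactly the
# kind of synthetic voice that tends to sound robotic.
engine = pyttsx3.init()
engine.setProperty("rate", 150)  # speaking speed in words per minute

# List the voices available on this machine.
for voice in engine.getProperty("voices"):
    print(voice.id)

engine.say("This is what a classic text to speech voice sounds like.")
engine.runAndWait()
```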

One final thought is based on a recommendation from the article Rethinking Assistive Technology. The article offers seven recommendations for rethinking assistive technology, and the one that stood out to me the most was that we should consider using "technology-enhanced performance" as a replacement for the term "assistive technology". I like this so much because it breaks down the barriers and stigmas that might be associated with students who use assistive technology. The adaptations shouldn't be something that makes users feel singled out or different, and changing the name might help break down those barriers a bit.

What are your thoughts? How do you adapt for your students? Do your adaptations always involve technology, or are some of them less sophisticated? Have you had any experience with TTS software, and did it involve a Mr. Roboto? Do you think TTS software will ever sound 'human'?