Mrs. K Science: The Blog

This is a place where I'll post my musings about technology in the classroom, lessons I've used successfully, and other Ed Tech stuff I find/use...

Jigsaw 103: Making the Grade

posted Oct 6, 2017, 7:02 AM by Emily Kroutil

This session focused on Panes 3 and 4.  Pane 3 shows a map by default, but can be used for a variety of activities.  The one I've used most is the Notes function.  I like to go over what I call "Housekeeping Information" at the beginning of my synchronous sessions.  I discuss the upcoming due date, what is due by that date, and then the week leading up to the next due date.  Sometimes I provide links to important resources for them to check out over the week, even though the links aren't live in the recording.  The first time I tried it, I forgot to share the notes, which resulted in a recording that looked like this:
My one student in attendance was too shy to ask why she wasn't seeing anything on the screen and I didn't know to ask if she was seeing what I was writing.  I only found out when I viewed the recording.  I was able to figure out what I'd done wrong via Yammer and the next time I tried the notes function, it worked better:
The text showed up, but even on "Follow Me", it was VERY small.  There is no way to make the text larger at this time, so I've actually stopped using the Notes function for this "Housekeeping" part of the synchronous session and have transitioned to using the whiteboard, which is Pane 4 in Jigsaw:
This way, I can make sure that the text is large enough to read, even on the recording.  As I'm writing, I discuss important parts of the items I'm writing down, such as, "this test is scheduled the same day as the due date, so make sure you are following the schedule so you don't get behind and you have time to take the test and get full points for it!"  If I want to add links, I post them in the chat box.  They still aren't live in the recording, but students can still see them and could type them into their browser, since the links are usually fairly short.  The whiteboard has proven better for listing these "Housekeeping" items than the Notes function.  

I haven't yet tried a poll because when I do have students in attendance, it's usually only one student, so I just verbally ask them how it's going, what they have questions about, or what they do or don't understand from what I just discussed.

I also annotate quite a bit with my classes, but I tend to use Pane 2 for that.  For me, it's easier to upload a PowerPoint presentation with a physics problem already written out on it and annotate that way.  Those end up looking something like this:
As far as best practices go, I've found the following to be important to keep in mind when utilizing these panes:
  • If you have students in attendance, always ASK them if they can see what you're writing/talking about.  Just because you can see something DOES NOT mean students can see it.
  • The Notes function uses VERY small type and the recordings are often made at a lower quality than your screen resolution, so this, combined with the small type, can make the Notes difficult to read during recordings.
  • The whiteboard has a variety of tools.  Try these out before you use them with students.
  • I like using different colors for different parts of problems I'm working out.  I think this makes it easier for students to follow and when you're done, you don't have a huge wall of similarly-colored text.
And here's my attendance check:

Jigsaw 102: Going to Class

posted Sep 7, 2017, 5:11 PM by Emily Kroutil   [ updated Oct 5, 2017, 3:55 PM ]

I've used Pane 1 a few times in my courses.  Most frequently, I've used Pane 1 as a "fun" way to introduce a concept discussed in that session.  For the following video, I was discussing velocity (start at 3:15 to see the video):

Math of Physics Review

This video describes the difference between speed (scalar) and velocity (vector).  It is useful during the week this session was recorded, but also in a future module when we discuss vectors, something students tend to have difficulty with (you'll see the next video also discusses vectors...did I mention students struggle with having a magnitude AND a direction??).

The next video I used with my students had a clip from Despicable Me, but is entirely related to our content (I swear!).  I thought it might get the students laughing, maybe engage them a little bit, and then I could bring it back to physics after the clip (start at 3:18 to see my intro and the video; stopping at 6:15 will let you see my explanation after the video):

2D Motion Review

After the video, I brought it back to physics by reminding the students that vectors MUST have a magnitude AND direction, an important concept from the module we were currently studying.

I've also used Pane 1 as a backup in my course.  I will pre-load instructional videos of myself solving the problems I hope to solve during the live session.  Then if my slate is not working, I can show an instructional video of me solving the problem.  I prefer to solve the problem live (I think it is more dynamic that way), but having the videos preloaded at least lets me know I've got a backup in case my preferred method doesn't work.  You can see in the screenshot below that I put the title of the video and the time within the video, so if I needed to use it, I'd be able to go right where I needed to be without losing too much time during the session:
So the video that had this problem solved was called "Finding the Resultant" and I solved the problem starting at 3:10 and continued solving it until 9:31.  Here's what it looks like with the back up videos loaded in Pane 1:

And here's my attendance check:

Jigsaw 101: Setting Up For Success

posted Aug 11, 2017, 7:37 AM by Emily Kroutil   [ updated Aug 11, 2017, 7:45 AM ]

This year, GAVS is moving from Adobe Connect to Jigsaw for our synchronous sessions.  This is proving to have quite the learning curve for many folks, myself included.

One of the first things I had to do to prepare for my first Jigsaw session was upload assets to my asset library.  My first session was a welcome session, so I wanted to upload my syllabus as a PDF (attachment) and a Welcome PowerPoint (Presentation).  Then, because screen sharing wasn't working very well, I also wanted to upload a screencast video (video) I'd created to give my students a tour of the course homepage.  I gave all of these documents the tag "Fall 2017" so I could easily find everything I'd uploaded specifically for Fall 2017.  I also used the tags syllabus, welcome, 18 week, 16 week, 14 week, 12 week, and powerpoint where applicable.  The screenshot below shows me searching for the "Fall 2017" tag, which gave me these "Fall 2017" specific documents.

The following YouTube video is the recording of my Welcome Session.  We've found that students sometimes leave the video links open, which then locks other students out of the recording.  I didn't want to deal with fielding emails from parents when their kiddos couldn't access the recording and also try and get ahold of all of my students and convince them to *pretty please* close the window when they are done watching a recording.  So, I downloaded my recording and uploaded it into iMovie.  In iMovie, I cropped out the student names so as not to violate FERPA.  Then, I uploaded the cropped video to YouTube and posted the link to the YouTube video in my course.  Some things to be aware of in this video:
  • I used my "Fall 2017 Welcome Session" Presentation to go over important policies and procedures with my students (0:27 - 34:00).
  • I uploaded all 4 of my "Fall 2017 Syllabus" attachments so my students could download their appropriate syllabus (you won't be able to see this because I had to crop out the entire right side to remove student names).
  • Towards the end, I played my "Fall 2017 Welcome Tour" video for my students (fast forward to 35:58).

Fall 2017 Welcome Session

I think the most important thing for success in a Jigsaw session is to do a few things:
  • Be prepared.  It is important to plan out in your head how you'd like to structure the session, especially since screen sharing isn't working (or isn't working well).  I can't decide spur-of-the-moment that I want to share my screen and walk my students through something.  For the welcome session, I knew I needed to create a screencast of me walking the students through their course homepage before my session.  This meant that, the day before, I recorded the screencast, edited it in iMovie, and uploaded it to YouTube so I could easily add it as an asset in Jigsaw.  If I hadn't been thinking in advance about what I wanted to do for my session, I wouldn't have had time to pre-record a video to play during it.  Because you have to upload assets in advance, you need to plan your session in advance so you can pre-load your assets into Jigsaw.  There is no spur-of-the-moment deciding that you'd like to add something else to your presentation.
  • Keep it simple.  Many folks were so-called experts at Adobe.  They had figured out how to play games with their students, upload movies and music to play at the beginning, etc.  Luckily (I guess...), I was still fairly new to Adobe since I started in Spring 2017.  So, I was already keeping my sessions fairly simple because 1) I didn't want to get too ahead of myself and 2) I was having trouble getting students to attend, so I couldn't do a ton of interactive things with them because there was no one to interact with.  With Jigsaw, I toned it back even further.  I knew that I wasn't terribly familiar with the platform and that the students weren't either.  So I kept it very simple - a presentation and a video.  I also prefaced the session by telling the students that it was a new platform for everyone, so we would just take it slow and try our best.  If something happened, we would try our best to troubleshoot whatever problem we were having.
  • Be flexible.  I knew that screen sharing didn't work, so I pre-recorded a video.  But, in case the video didn't work, I had screenshots of important parts of the homepage in my presentation.  
And here's the Attendance checker from this session, just to show that I watched the recording:

Evaluate: Self-Reflection on Teaching Abilities Quest

posted May 25, 2016, 9:34 AM by Emily Kroutil   [ updated May 25, 2016, 9:34 AM ]

For this last quest, I have created a Google Drive folder with comments from other educators, students, my TKES evaluations and self-reflections, and samples of PD I have participated in:

Self-Reflection on Teaching Abilities Quest

I find it difficult to talk about myself in this sort of way, but on this last day at Savannah Arts (Truly, it is my last day.  I could have never planned to reflect on my teaching career on my last day at my favorite school...), I have to believe that I made a difference in my students' lives.  I worked very, very hard to create a learning environment where my students could succeed and feel confident about their science abilities.  We get a lot of kids in Physics who have just come from Chemistry and struggled with it quite a bit.  They come to me saying things like, "I can't do conversions" or "I'm bad at science," and I work very hard to praise them and show them that they CAN be successful at science.  Physics, no less!  In my AP class, I work very hard to teach them about environmental science and also how to be responsible citizens and voters.  I try to encourage them and build them up as scientists.  I am often told that AP Environmental Science (APES) was a student's favorite class, that they learned so much in that class, and that they talk about what we learned in APES with their friends and family.  This is perhaps the greatest compliment I can get from a student.  I love when they see how the things they learn in my classroom are applicable in their life outside the school building.

I think I am quite adept with technology, and I have made great strides flipping my classroom when no one else in my department or school was doing it.  I researched various LMS options, spoke with my Technology Specialist, created videos, guided notes, learning objectives (competencies) based on the standards, and so much more, because I believed that it was a superior method to what I was doing before, and my students excelled using this method.  I feel like this is not quite virtual school, but it is a blended method of learning, and I know that I could make the transition to teaching virtual school successfully.

My weakness as an online educator is probably that I haven't ever taught a fully online class.  However, I know that you have to start somewhere and if you ask anyone who has worked with me, they will tell you that I work incredibly hard and I do what needs to be done for my students to succeed.  

Evaluate: Differentiation Quest

posted May 25, 2016, 8:37 AM by Emily Kroutil   [ updated May 25, 2016, 8:39 AM ]

For this quest, I looked at data from my Projectiles unit to maintain continuity with the Create Skill.  First, I looked at the formative assessment quiz for the first 4 learning objectives (competencies).  These objectives basically deal with breaking vectors into components and calculating the components and the angle.  It is prep work for the actual "meat" of the unit: solving horizontal and vertical projectile problems.  The class scores for this quiz were as follows:
For this quiz and these objectives, I was actually really happy with how my students performed.  A few struggled, but I was able to work with them individually to address these weaknesses.  This tells me that my students have a pretty good grasp on trig and the prep concepts needed to solve projectile problems.  This also tells me that they are pretty good at finding what they need in the word problems and, if given the formula, can manipulate it correctly to get the answer.  So, I continued to the next set of objectives (5-6), which are solving horizontal and angled projectile problems, respectively:
So far so good.  Then I get to the next two horizontal projectile questions:
Ok, stop!  Something here is drastically different.  We are down to between 41% and 58% of students getting these questions correct.  For #4, there isn't one wrong answer that is particularly popular, so this tells me they were guessing and really had no idea how to solve this problem.  Students have historically struggled with a problem like this.  When they are given a height and a distance but no initial velocity, they struggle.  This tells me that we need to practice these types of questions more (and to be fair, these are the harder problems, and when I haven't looked at this material for a while, I struggle with where to start too).  We do a problem like this in their lab, but since they work in groups, the math work that is turned in doesn't always reflect their knowledge because someone who DOES understand the concept tends to help them with it (lets them copy the work without showing them how to do it).  I call this "not teaching them how to fish."  I have no problem with them helping each other, but my requirement is that the person doing the helping "teaches them how to fish and doesn't simply give them a fish".  They don't always follow this rule, however (they are teenagers!).

Now, let's move on to problem #5.  Less than half of them know how to do this one, and there is an INCORRECT answer that was chosen almost as often as the correct answer.  This tells me that they at least know how to start the problem and got stuck or lost somewhere along the way.  They chose the distractor quite frequently, which tells me that they are making a common mistake here.  This could be remediated individually when I go over the quiz; an instructional video could be created discussing this particular type of problem (could be helpful with #4 too!); or, in the case of virtual school, we could talk about this in our weekly synchronous session and maybe even have a discussion about this type of problem.  They are making a common mistake, and this is easily remedied.  Much easier than #4, where almost half of the students have no clue where to begin.  It could also be useful to pair a student who got #5 correct with a student who got it incorrect (especially since it's almost exactly a 50/50 split) and have the person who got the problem correct look over the work of the student who got it incorrect and "teach them how to fish".

Now, let's look at some individual quizzes.  I've eliminated their names to comply with FERPA, so they will simply be referred to as Student A and Student B.  Student A's quiz is below:
Student A obviously needs a little more work on this section.  She got question #1 correct, which tells me she knows the first step of a horizontal projectile problem (solve for time!), but then gets stuck after.  She also got #8 correct, which didn't require any math, but was actually a concept question.  So, this tells me all is not lost with her.  I know that she's at least looked at the material, but maybe didn't spend as much time on it as needed (I try and tell them over and over that this unit is the most difficult and will take more work than any other, but they're teenagers, so of course they know better than me).

Now let's look at Student B (We will come back to Student A):
I swear, I just picked two students randomly, but these two would actually make excellent partners for reviewing this section before they took the alternate version of the quiz.  Student B missed one problem and it just happens to be a problem Student A got correct! (Seriously, I couldn't have planned that if I tried).  They would make excellent partners because Student B could help Student A with the bulk of the problems that were missed, but Student A wouldn't feel like she was getting tutored by a friend because she could help Student B with the problem she missed!  I would also direct Student A to the instructional video where I explain how to start (the part she's excellent at!) and solve each step of a horizontal projectile problem.  It would probably be worthwhile for Student A to watch that video a couple times while she CONCURRENTLY is solving a problem so she gets practice with the various steps.  Then, once she's solved a couple with the video, she could try working on them by herself and ask Student B for help if needed.  Of course, I'm always available, but sometimes they like learning from their friends rather than me.
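If I ever wanted to automate this kind of matchmaking, the idea is simple enough to sketch in a few lines of Python.  This is just a toy illustration with invented data (the student labels and missed-question sets below are made up, not from my actual quiz), pairing the two students whose missed questions overlap the least, so each can "teach the other how to fish":

```python
# Toy sketch: pair students whose missed-question sets overlap the least,
# so each partner can help with questions the other missed.
from itertools import combinations

# Invented data: question numbers each student missed on the quiz.
missed = {
    "A": {2, 3, 4, 5, 6, 7},  # missed most of the math questions
    "B": {8},                 # missed only the concept question
    "C": {2, 3, 8},
}

def best_pair(missed):
    """Return the pair of students with the smallest overlap in missed questions."""
    return min(combinations(missed, 2),
               key=lambda pair: len(missed[pair[0]] & missed[pair[1]]))

print(best_pair(missed))  # ('A', 'B') -- their missed sets don't overlap at all
```

With more students, the same intersection idea generalizes, though a real matching over a whole class would want a proper pairing algorithm rather than a single `min`.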

I'm not sure if an LMS can do this automatically or if I could set up paths like:
  • If you score greater than 80%, you may move on to the next section.
  • If you score between 50% and 79%, watch this instructional video and do such and such before attempting the alternate version of the quiz.
  • If you miss question #5, view this resource and do such and such before attempting the alternate version of the quiz.
  • If you score lower than 50%, attendance at the weekly synchronous session is mandatory before you may attempt the alternate version of the quiz.
  • And on and on.
This could be similar to a choose-your-own-ending book (remember those?), except the students are following their own, personalized learning path.  This way the students are getting exactly what they need when they need it.  This would actually be an interesting method to try with my flipped students....
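For what it's worth, branching rules like these are simple enough to express as a short script.  Here's a hypothetical sketch; the thresholds, labels, and rule order are my own illustration, not the configuration of any particular LMS, and you'd have to decide how overlapping rules (say, a high score that still missed #5) should interact:

```python
# Hypothetical sketch of the adaptive quiz paths described above.
# Rule order is a design choice: a miss on question #5 is checked first,
# since it has a targeted remedy regardless of the overall score.

def next_step(score_percent, missed_questions):
    """Recommend a next step after a quiz attempt."""
    if 5 in missed_questions:
        return "q5-resource"    # view the #5 resource, then the alternate quiz
    if score_percent > 80:
        return "advance"        # move on to the next section
    if score_percent >= 50:
        return "watch-video"    # instructional video, then the alternate quiz
    return "attend-session"     # weekly synchronous session is mandatory first

print(next_step(92, []))   # advance
print(next_step(65, []))   # watch-video
print(next_step(65, [5]))  # q5-resource
print(next_step(40, []))   # attend-session
```

Most modern LMS platforms expose this kind of branching as "release conditions" or "adaptive release" settings rather than code, but the underlying logic is the same.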

Evaluate: Rubrics and Competencies Quest

posted May 24, 2016, 11:51 AM by Emily Kroutil   [ updated May 24, 2016, 12:04 PM ]

Competency-based learning is what I've been trying to accomplish in my flipped classroom.  I just didn't know that's what it was called!  I always called it Mastery-based learning.  This type of learning shifts the focus of a course to what the students have actually learned and away from things like seat time, credits, etc.  This is what I try and do in my own classes (with varying degrees of success...)!

The projectile unit in physics is based on the following GPS for Physics:
As with most of the GPS, the underlined sections (f, g) are quite broad, so I created learning objectives (competencies) that broke the broad elements of the GPS into more manageable chunks.  Then, I created organizational guides for my students with the learning objectives and the resources they can access to view and practice the content, followed by a quiz at the end of each section.  My Organizational Guides (basically lists of competencies/learning objectives I created based on the GPS) look like this:
The sections are divided into learning objectives (usually 2-4 per section) and the quizzes are also divided by learning objective (competency) so I can see exactly where the students are struggling.  I made the quizzes short (4-9 questions) so the students couldn't just "pass" a quiz by getting the majority of the questions correct while still lacking mastery of a learning objective.

I now realize that this is similar to competency-based learning.  My learning objectives are basically competencies (things I want the student to be "competent in" or "master") and the quizzes are the assignments used to demonstrate competency (mastery).  To expand on this, I could create a rubric like the following and use that to "grade" their work (assignments labeled "Practice" in the organizational guide above, lab write-ups, and anything related to that learning objective):

The items on the left of the rubric are the competencies (learning objectives) and the numbers are their level of competency with the skill.  I would suggest that at least a 3, but preferably a 4 or 5, be required to demonstrate competency (mastery).  If they can demonstrate competency (mastery) on the Practice problems, then they could be exempt from the quiz.  Alternatively, if they wanted to prove their competency (mastery) by doing the quiz, they could do that as well.  (Some students do not need to work the practice problems to "get" the concept, and these students would benefit from being able to basically "quiz out" of that section.)  If they wanted to complete the lab related to that learning objective and use that as their evidence of competency, they could do that as well.  This method allows students a variety of ways to demonstrate competency and also gives them more ownership of their learning.

As I spent more time with this rubric, I would probably find ways in which I would like to tweak or adjust it.  But, until I use it, I won't get a good idea of how it is lacking or what I should adjust.  Thus the problem with trying something new: that first class tends to be the guinea pigs.  Usually I let them know, "This is the first time I'm trying this, so let me know what is working and what isn't so I can adjust and adapt as we go along."

Evaluate: Data Driven Instruction, Analytics, Reporting Tools Quest

posted May 24, 2016, 8:36 AM by Emily Kroutil   [ updated May 24, 2016, 8:39 AM ]

The great thing about teaching a completely online course is that the software often gives you excellent data about your students and their progress.  As an instructor of one of these courses, you can see a wide variety of data.

The instructor can view enrollment and attendance.  You can see how many students are enrolled in the course and the length of the course.  You can also see demographic information about each individual student.  You can see who is responsible for paying for the course, whether they've taken the orientation course, who has dropped the course, and what sort of EOCT the students are taking, if any.  In the example in the quest, the school was responsible for paying most of the students' fees, all had taken the orientation, no one had dropped (yet), and all were taking the EOCT at the end of the course.

Instructors can also keep track of communication with stakeholders.  In each instance, the instructor sent a welcome call via Dial My Calls to the student.  In one example, the instructor used Dial My Calls to call the parent about a failing grade.  It seems like in this instance it might have been helpful if the instructor had sent the welcome call to the parent as well as the student.  The student in this example ended up dropping the course, which would have changed the "drop" information in the previous paragraph.  I liked how this software recorded the date and time of the contact.  It is an excellent way to CYA.  In the second example, the instructor sent a Happy Gram email to the parents of students that did well in the course.  I love this idea!  Sometimes, I will have students that ask me to please email their parents a Happy Gram about their grade on a test or simply work ethic in class.  I happily oblige!  Whatever it takes to get the kids engaged and learning the material!

The LMS allows instructors to see how the students are progressing through the course, how often they access the course, how long they are spending online, and what they are doing.  Are they spending their time wisely?  Are they procrastinating?  Are their habits causing a low grade (or a high grade)?  For example, if the students are doing all of their assignments right before their summative assessment, it might be useful to have staggered due dates for assignments so the students cannot procrastinate to the last minute with their work.  Some students aren't particularly good at pacing themselves, so this might help those students.  I would, however, make the assignments available early for those students who like to work ahead (I'm one of those students...).  This might allow students to work at relatively their own pace, but also provide some structure for those students who have difficulty with the unstructured nature of online classes.

The LMS also allows the students to look at their own data, which is quite nice.  The students can view their grades on assessments AND personalized feedback from the instructor.  This is nice because it allows the students to know why they lost points if they lost points, but also lets the student know that the teacher read their work.  I have to admit, I appreciate that the verifiers in this course provided detailed feedback for these blog posts.  It lets me know that I'm not wasting my time creating these entries and that at least one person will read them.  The students feel the same way about comments on their work.  Students can also look at questions they have missed on assessments, which is great because they can see where they are struggling and practice those specific types of questions a little more.  Reviewing these assessments would probably also be useful when preparing and reviewing for summative assessments.  It would probably be a good idea to allow the students to access their assessments AFTER the due date so "collaboration" (i.e. cheating) is more difficult.

At the beginning of the course, it would be useful to add instructions on how to access their own data to the orientation information.  It would also be useful to post those instructions somewhere on the LMS where students can view them easily.  That way, if they forget how to access their data, they can quickly find the instructions again.

Evaluate: The Summative Assessment Quest

posted May 24, 2016, 5:51 AM by Emily Kroutil

For this blog entry, I am going to use a paper test I created for my students as an example.  I will just show the first page of the test, so as not to give away all the questions in case other physics teachers in my school are using similar questions/exams.  I will also explain how I would adjust the administration of the exam in order to ensure validity, reliability, and security in an online environment.

This particular test is from our Electricity unit.  I would have liked to use my projectiles unit for continuity, but I was on maternity leave during that unit this year and I'd like to use a unit that I taught both years for when we look at the data.  The following is the first page from the electricity test:
The first thing you should notice is that each question is tied to a particular learning objective in that unit.  This ensures that the students are being tested over the material they have learned.  If a question does not specifically relate to a learning objective from that unit, it is thrown out.  If the question seems to cover an important concept in that unit, but isn't tied to a learning objective, then I know I need to go back and adjust my learning objectives.  This is one way I ensure that my test is valid.  I also look closely at the test and make sure I don't have any unintended cultural biases in my test.  For example, my first year teaching, a physical science test had a question that basically asked, "Why would motorists put sandbags in the trunk of their car when driving on icy roads?"  Someone who lives in a place where it ices would know the answer.  However, my students did not know the answer because they'd lived in the South their whole lives.  They were getting the question wrong, not because they didn't understand the concept being tested, but because the question had an unintended cultural bias.  When teaching online school, I would make sure to check the assessment questions against the learning objectives for my course and make sure that they were aligned.

I don't often give the same test to the same student twice, so it is difficult for me to check reliability that way.  However, over the past two years, I've given this test to over 200 students.  When I look at a histogram of scores from both years I get the following:

These scores are very similar, with only a 2% change in average score.  One thing that is noticeable, however, is that in the second year there were more very low scores and more very high scores.  I can only guess that the very low scores were from students who knew it was the end of the year and that this was the last test before the final.  I let the students' final exam grade replace any test grade that quarter, so there are those students who blow off the last test because they have so much going on with AP exams and other state-mandated tests, and then buckle down and study for the final exam, knowing the final will replace this test grade.  I also told the students that I would not make them take the final if their year grade was over 95%, as a thank you for working hard all year.  Some students were very close to this average and studied very hard for the last test, trying to bring up their year grade enough so they didn't have to take the final.  Even with the lower grades, the average score was a little higher this year (electricity 2015 clone).  I'd like to say this was because I improved as a teacher, but it probably wasn't that at all. :/

When teaching online classes, I would look at similar data to make sure that my tests were reliable.  If the average scores were drastically different from year to year, that could suggest that my test is not reliable.  Of course, I would need to be teaching the course long enough to be able to have some of that data.  I could also look at the histograms and compare sections with each other.  If the sections have drastically different scores, this could also indicate a lack of reliability.
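Once scores are exported from the gradebook, this year-to-year comparison could even be automated.  Here's a rough sketch with made-up score lists and an arbitrary 5-point drift threshold (both the data and the threshold are my assumptions, not values from my actual classes):

```python
# Rough sketch of the year-over-year reliability check described above.
# The score lists and the 5-point drift threshold are invented for illustration.
from statistics import mean

scores_2015 = [62, 71, 78, 85, 90, 74, 68, 81]
scores_2016 = [58, 73, 80, 88, 95, 70, 66, 84]

avg_2015, avg_2016 = mean(scores_2015), mean(scores_2016)
drift = abs(avg_2016 - avg_2015)

print(f"2015 average: {avg_2015:.1f}, 2016 average: {avg_2016:.1f}")
# A small drift (a few points) suggests the test behaves consistently year to year;
# a large one is a cue to re-examine the questions.
print("re-examine test" if drift > 5 else "averages consistent")
```

The same loop could compare section-by-section histograms, which is closer to the comparison I describe above than a single average.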

When administering a test in my classroom, I work hard at maintaining test security.  Tests are kept in a locked cabinet, students are only allowed to work on test corrections in my classroom under supervision, and I try to watch my students closely to make sure they are each doing their own work.  In the online classroom, I would set a time limit for my test, not allow students to look over their score and choices until everyone had completed the test, if at all, and try to have a large test bank to choose questions from.  I especially like software that lets you pick the questions, scrambles them, and gives each student a subset of the questions, so each student has a different question set.  Getting to know your students and their work is also crucial.  For example, when I was in college, my mom wanted me to write an essay for my brother, who was in high school (no judgement, please).  I told her that I couldn't write the essay for my brother because he had been writing his essays all year.  His teacher would know his style, common grammar mistakes, etc.  If I wrote an essay and he submitted it as his, the teacher would know that it wasn't written by my brother because it would be completely different stylistically.  In that same way, I get to know my students and can tell if they have simply improved a great deal or something else is going on.
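The "different question set per student" idea is something most quiz tools handle for you, but the mechanics are simple enough to sketch.  This is a toy illustration; the bank size, subset size, and the choice to seed the shuffle on a student ID (so regrading pulls up the exact same exam) are my own:

```python
# Toy sketch: draw a different, repeatable question subset for each student
# from a larger bank. Question IDs and sizes are invented for illustration.
import random

QUESTION_BANK = [f"Q{i}" for i in range(1, 31)]  # a 30-question bank

def build_exam(student_id, n_questions=10):
    """Deterministically draw and shuffle a per-student question set."""
    rng = random.Random(student_id)  # seeding on the student makes it repeatable
    return rng.sample(QUESTION_BANK, n_questions)  # unique questions, shuffled order

exam = build_exam("student-a")
print(len(exam))                           # 10
print(exam == build_exam("student-a"))     # True: same student, same exam every time
```

Because each student's draw is seeded on their ID, you can regenerate any student's exam later for review or grading disputes without storing every generated set.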

Create: Define and Explain Learning Object Authoring Tools Quest

posted May 23, 2016, 11:31 AM by Emily Kroutil

I found five learning object authoring tools in my research.  A few of them I've experimented with and some of them I've used quite a bit.  

Three Free Tools:
  1. iBooks Author 
    • This app is free, but it does require an Apple computer.  You can create multimedia books with videos, images, quizzes, etc. that can be viewed on an iPad.  I've created half of two different books for my students, writing the content in a student-friendly, conversational format.  I got stuck in a couple of places, however: 
      1. It takes a lot of TIME to write a textbook for students, and teachers don't have a lot of time. 
      2. Finding Creative Commons media to use in the book was difficult.  Even though I wasn't planning on selling the "textbook", I didn't want to violate copyright law.  With enough time, this could be a viable option for presenting material to students, especially in a virtual school atmosphere.
  2. Google Drive
    • I've used this free suite of apps quite often.  You can create documents, presentations, and spreadsheets.  You can also use Google Forms to create surveys or quizzes for students.  I've also created a bare-bones spreadsheet, given my AP kids the link, and had them fill in their individual data for an experiment, building a class data set in an easily accessible place.  The collaborative nature of Google Drive is nice because I can create a learning object template or a partial learning object and the students can complete the object, which gives them more ownership of the material.  Hopefully, this translates to a better understanding of the material.
  3. QuickTime
    • There is a paid version of QuickTime and a free version; I use the free version.  QuickTime allows you to record your computer screen and narrate, if you'd like.  I used this software with the eInstruction software below to create my screencasts.  It is Apple software, but it is available for other computers as well.  It would also be interesting to have students author their own screencasts explaining a concept.  Again, this gives them more ownership of the material, and students who teach other students tend to comprehend the material better.  The videos could then be posted on the course page for other students to view.  After several classes/semesters of this, the teacher could have a large repository of instructional videos, all created by students, many of which offered varying perspectives on the material.
Two Paid Resources:
  1. Camtasia
    • Camtasia Studio is $179.00 and Camtasia: Mac is $75.  TechSmith does offer a 30-day free trial of Camtasia.  This is the same software Sal Khan uses to create his videos for Khan Academy.  I liked this software better than the eInstruction software below.  However, I'm cheap, and even though I have a Mac, I couldn't quite justify the $75 when I already had the eInstruction software provided by my district.  Still, if I were to start creating a large number of instructional videos, I'd probably bite the bullet and purchase Camtasia.  It could (obviously) be used to create instructional videos for students.  I wouldn't have students use it, because I wouldn't want to ask them to pay for the software.  (Although if it were a one-time assignment, I guess I could have each of them sign up for a trial account.  There is a learning curve, however, and I'm not sure it would be worth it for a single assignment.)
  2. eInstruction WorkSpace
    • It looks like eInstruction may have been purchased by or renamed Turning Technologies.  I'm not certain how much it costs because this software is usually sold to districts rather than individual teachers.  So it's probably expensive, but it also probably comes with quite a few licenses.  My copy came with a Mobi slate (link is to the UK site) that our PTA purchased for me.  This software is meant for interactive whiteboards; however, I used it in conjunction with QuickTime and the Mobi to create instructional videos.  I liked this software because it let me use graph paper as a background and draw perfectly straight lines (something I CANNOT do, especially on a slate).  It also let me use a variety of colors, which was useful when I wanted to explain different parts of a problem or make something stand out.  Again, this isn't something I would have students use to create objects, but it is useful for a teacher creating instructional videos.
There are many other tools that have been posted on the Forums, but I wanted to stick to resources that I've personally used so I could provide a firsthand account of their usefulness.

Evaluate: Quality Feedback Quest

posted May 23, 2016, 8:44 AM by Emily Kroutil   [ updated May 23, 2016, 10:41 AM ]

This assignment (pp. 27-28) is my AP Environmental Science students' first foray into the critical thinking and experimental design that are necessary in science.  The lab report is also one of the first times they've been asked to write scientifically, so we discuss how to write scientifically, look at scientific papers as style and formatting examples, and "write" an example scientific lab report as a class.  They submit these lab reports online and I comment on them.  This first lab report is especially important because it sets the stage for all future lab reports.  If I let something slide in this first lab report, they will continue to make those mistakes in the future.  Below is a lab report with my written feedback:

There are a few best practices (as stated in the quest) that can be evidenced in my feedback on this assignment:
  1. Customized feedback is provided that is not only encouraging, but propels the student to strive for better performance or for deeper thought and application
    • The feedback for this assignment is directly tailored to the student and what he or she did or did not do on the assignment.  I remind the student of sample works we looked at as a class, and for certain items I ask the student, "What makes you think this?" or "Why did you do this?"  Hopefully, this prompts the student to look at their work, think, and provide justification for why they wrote what they wrote.  This is especially important in AP Environmental Science because the FRQs (written responses) on the AP exam often ask students to EXPLAIN or DESCRIBE, and they often just IDENTIFY, so I have to ask "why?"  Hopefully, my comments on this assignment at the beginning of the year help the students start explaining "why" they write what they write in my class.  Additionally, providing the WHY often gets students to think about what they are doing/writing on a deeper level.
  2. Student data drives the feedback provided as individual feedback, as well as the class as a whole
    • This particular assignment has feedback on what this particular student needs to work on.  When grading all assignments from all students, I can get a clear picture of what we need to talk about as a group.  For example, students often IDENTIFY rather than DESCRIBE or EXPLAIN in assignments such as this one.  Since this is such an important skill for a scientist and for an AP Environmental Science student, we often spend some class time after this assignment working on DESCRIBING and EXPLAINING.  By looking at each student's work individually while also making notes on how the class is doing as a whole, I can use this data to reteach or spend extra time on a concept that the entire class (or most of it) is struggling with.
  3. Teacher is not only assessing current progress on individual items, but is also analyzing each student’s continuous progress from grade item to grade item and communicates the progress to all stakeholders
    • Because this is the first writing assignment we do during the year, the feedback provided doesn't show continuity from assignment to assignment, but I make note of which concept or item each student is struggling with, and on the next assignment I check whether they are following my advice/feedback and whether their work reflects it.  If so, that means they are taking the feedback in and using it to inform their next assignment.  If not, that could be because they ignored my previous feedback or still do not understand the concept.  Either way, I need to address the misconception.
  4. Instructor clearly strives to help students meet and exceed expectations and considers feedback the cornerstone of that success.
    • I work very hard to make sure my students have the feedback necessary to progress in their learning.  Whether it is written feedback like that shown on this assignment, verbal/written feedback from going over quizzes with my students, or simply directing my students to a useful resource or affirming that they are doing the work correctly (I have so many students who just want me to watch them work so I can confirm they are doing the problem correctly), it is important to me that I deliver timely feedback.
