Engaging Students with Sapling Learning's Interactive Labs

Kelly Lancaster

November 11-13, 2013
Abstract: 

Sapling Learning is an education company (saplinglearning.com) that provides online homework and instruction for the science disciplines. In addition to its learning platform, the company develops interactive labs that support the inquiry process. Each lab comes with suggested homework and clicker questions that probe student understanding of the concepts. This article will offer examples of integrating Sapling Learning's interactive labs with instruction to engage students inside and outside of the classroom. The article will also describe the principles that guide the design of the labs, with particular emphasis on design considerations for touch interfaces (1).

Online homework

Sapling Learning’s mission is to engage students and empower educators. We aim to empower educators by providing a course management system that automatically monitors student progress. We aim to engage students by providing online homework that gives instant feedback on their understanding of course content. The screenshot below shows how a question appears to students. Our library includes over 10,000 such questions.



The goal of this question is for students to compare different representations of molecules. They can rotate the 3D models, and they have the option to view a hint. Each molecule contains 4 or 5 bonds. The labels include distractors, such as shapes with fewer than 4 or more than 5 bonds. The question is also randomized so that some students will see SiH4, for example, while others will see SiF4 or SiCl4. This encourages students to collaborate meaningfully on their homework.
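
A minimal sketch of one way such per-student randomization could work is shown below; the molecule pool and the hashing scheme are illustrative assumptions, not Sapling Learning's actual code.

    // Illustrative only: serve each student a stable, pseudo-random
    // variant of the question so that neighbors see different molecules.
    var variants = [
      { formula: "SiH4", shape: "tetrahedral" },
      { formula: "SiF4", shape: "tetrahedral" },
      { formula: "SiCl4", shape: "tetrahedral" }
    ];

    // Hash the student id so a student keeps the same variant across
    // page loads, while different students tend to get different ones.
    function pickVariant(studentId) {
      var hash = 0;
      for (var i = 0; i < studentId.length; i++) {
        hash = (hash * 31 + studentId.charCodeAt(i)) % 997;
      }
      return variants[hash % variants.length];
    }

    console.log(pickVariant("student-42").formula); // e.g. "SiF4"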

Immediate feedback

One feature of online homework is that students receive immediate feedback on their understanding after they submit an answer. This feature has been associated with improved student performance (2). At Sapling Learning, we have two modes of feedback for incorrect answers: specific and general. For the specific feedback, our authors predict common student errors and provide targeted responses. The image below shows an example of the specific feedback for the question above.



We include general feedback for those errors we cannot anticipate. This ensures that all students get some form of assistance. As shown in the image below, the general feedback for the question above includes a table that contains the formula of each molecule along with 2D and 3D models. This gives students more guidance toward the correct shape. Did students actually label the 2D model of CF4 as square planar? Read on for the results.
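
To make the two-tier feedback concrete, here is a minimal sketch of a targeted-response lookup with a general fallback; the error keys and messages are hypothetical, not drawn from the actual question.

    // Illustrative two-tier feedback: targeted responses for predicted
    // errors, with a general fallback for everything else.
    var specificFeedback = {
      "square planar": "Count the electron domains on the central " +
        "atom. Four bonds and no lone pairs give a tetrahedral shape.",
      "trigonal pyramidal": "A trigonal pyramidal molecule has only " +
        "three bonds. How many bonds does this molecule have?"
    };

    var generalFeedback = "Compare your answer with the 2D and 3D " +
      "models in the table, paying attention to lone pairs.";

    function feedbackFor(answer) {
      // Unanticipated errors fall through to the general feedback.
      return specificFeedback[answer] || generalFeedback;
    }

    console.log(feedbackFor("square planar")); // targeted response
    console.log(feedbackFor("bent"));          // general fallback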

Interactive labs

Our products span the range from assessment to instruction. We recently launched a series of eBooks for high school science that include videos, 3D animations, and interactive labs. We used HTML5 to develop the content to enable use on multiple platforms. To date, we have created approximately 50 interactive labs for the subjects of physics, chemistry, and biology. Below is an example of one of the chemistry labs.


Click on the image to open a video of the lab.

The goal of this lab is for students to examine the effect of concentration on the pH of a solution. Students can add a common liquid, such as coffee or juice, to the beaker and use the probe to measure the pH. They can add water or open the drain and observe the effect on the color of the liquid. The tick marks along the side of the beaker also enable students to quantify the effect of dilution. In what follows, we describe our design process for the labs (3).
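
As a worked example of the chemistry the lab lets students discover, the sketch below computes the pH of a strong acid after dilution. It neglects the autoionization of water, so it is only valid well away from pH 7, and the numbers are illustrative.

    // pH = -log10[H3O+]; diluting a strong acid lowers [H3O+]
    // proportionally, so each tenfold dilution raises the pH one unit.
    function pHAfterDilution(initialPH, initialVolumeL, finalVolumeL) {
      var h = Math.pow(10, -initialPH) * initialVolumeL / finalVolumeL;
      return -Math.log(h) / Math.LN10; // log10, written for older browsers
    }

    // Diluting 0.1 L of a pH 2 solution to 1.0 L gives pH 3.
    console.log(pHAfterDilution(2, 0.1, 1.0)); // 3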

Design goals

The design of each interactive lab is guided by two types of learning goals: content and process. Our content goals are for students to develop a conceptual understanding of the science topic. Our process goals are to engage students in science by giving them an opportunity to ask their own questions and test their own hypotheses.

For the high school labs, our design goals also align with the Texas Essential Knowledge and Skills (TEKS), the state standards. Below is an excerpt from one of the science concepts (4).

TEKS Chemistry 10F: “The student is expected to investigate factors that influence solubilities and rates of dissolution such as temperature, agitation, and surface area.”

We developed two labs to address this standard: one for solubility, and one for dissolution rate. In both labs, students can dissolve fine or coarse salt or sugar in water, and they can use a hot plate to examine the effects of heating or stirring the solution. In the Solubility lab, students are given measuring spoons to compare the amounts required to reach saturation. In the Rate of Dissolution lab, students are given a timer. We performed the actual experiment ourselves to obtain relative dissolution times.





Our design goals are also informed by research: We consult the chemistry education literature for insight into student ideas. Our set of electrochemistry labs provides an example. In one of the labs, students can build a voltaic cell using half-cells and a salt bridge. We included an option to animate the flow of electrons from the anode to the cathode to confront the student ideas about current flow reported by Sanger and Greenbowe (5).

Implicit scaffolding

We apply the idea of implicit scaffolding (6) to create environments that enable students to ask their own questions. This allows us to guide students while giving them a sense of autonomy. A common example of this idea comes from door design: A door that people must push to open should not have a handle that people can pull. Below we use the Specific Heat lab to illustrate our use of affordances and constraints.



The goal of this lab is for students to plan a procedure to determine the specific heat of a metal. Students can drag the cup, the metal block, and the thermometer. Students can also select and identify a mystery metal. They can use the reset button to repeat an experiment.

This lab affords certain actions. The water dropper is poised above the cup to cue students to add water. Likewise, the metal block is poised above the burner to cue students to heat the metal. The balance and the thermometer cue students to measure mass and temperature. Even the dropdown menu cues students to compare the metals.

This lab also constrains certain actions. For example, students can only add water and heat once per experiment. Students are also not able to drag the metal block out of the cup. Part of the reason is to simplify the model, but the main reason is to encourage productive experimentation.
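
A bare-bones sketch of how such a constraint might look in code is shown below; the function names are hypothetical, and the real labs are certainly structured differently.

    // Illustrative constraint: water can only be added once per
    // experiment, and the reset button re-enables the dropper.
    var waterAdded = false;

    function addWaterToCup() {
      console.log("water added"); // stand-in for the real animation
    }

    function onDropperClick() {
      if (waterAdded) return; // constrain: one addition per experiment
      addWaterToCup();
      waterAdded = true;
    }

    function onResetClick() {
      waterAdded = false; // a fresh experiment affords water again
    }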

Interface testing

We use two types of user experience testing to ensure that an interface is intuitive: hallway and online. For hallway testing, we literally walk around the Sapling Learning office and ask coworkers to think aloud while using the lab. We occasionally use a screen-capture program to enable us to revisit the tests. After about three users, we begin to observe common interface issues and interpretation errors.

For online testing, we employ a service (UserTesting.com) that provides on-demand usability testing. We create the test and they recruit the testers. Within an hour, we can watch videos of people using the lab and read their responses to our follow-up questions. Here we use the Atom Builder lab to give an example of the feedback.


Click on the image to open a video of the lab.

In this lab, students can build atoms with protons, neutrons, and electrons. They can examine the effect of each particle on the identity of the atom, the charge, and the mass number. The nucleons shake when students build an unstable nucleus. Because the electrons move so rapidly in the play area that they are hard to locate, students can click anywhere outside the nucleus to remove an electron. Our representation of the atom was inspired by a Nature article on hollow atoms (7).
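
The rules behind the lab's readouts are simple to state. The sketch below is an illustrative reconstruction, with a truncated element lookup and no attempt at a real stability check (which would require a table of nuclides).

    // Illustrative readout rules: element identity from the proton
    // count, mass number A = Z + N, and net charge from p - e.
    var symbols = ["", "H", "He", "Li", "Be", "B", "C", "N", "O"];

    function describeAtom(protons, neutrons, electrons) {
      return {
        identity: symbols[protons],
        massNumber: protons + neutrons,
        charge: protons - electrons
      };
    }

    // "What changes the number after the name?" Adding a neutron:
    console.log(describeAtom(2, 1, 2)); // He, mass number 3, charge 0
    console.log(describeAtom(2, 2, 2)); // He, mass number 4, charge 0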

For the online user testing, we set up the scenario: “You are a student in an introductory science class. Your teacher has asked the class to use an online lab for homework.” The first task was to explore the lab for a few minutes. The next few tasks were more specific. In the video clip below, one of the testers is using the lab to answer: “What changes the number after the name?”


Click on the image to open a user testing video.

Note how she uses the lab to test her predictions. “Is it the number of electrons? Let’s try it.” The immediate feedback in the lab allows her to develop a rule for the mass number. “I think the number shows how many protons and neutrons are in the atom.” Then she uses the lab to demonstrate the rule. This example suggests that students can learn from the lab without explicit guidance.

Iterative process

We use the results of user testing to make changes to the lab. For example, in the hallway testing for the Atom Builder lab, we saw that some users thought the way to make the nucleus stable was to add the nucleons in the correct order. The image below shows that the locations of the nucleons no longer depend on the order in which they are added to the nucleus.

In the online testing, we saw that some users equated stable and neutral. As shown in the image below, we added the word “nucleus” to the stability readout. We also changed a phrase in the Help text from “build as many stable, neutral atoms as possible” to “stable and neutral atoms”.

In both forms of interface testing, we saw that users were hesitant to click on the Help button. This was particularly the case for male users. The image below shows that the Help button is now an “Info” button, but we are still exploring other ways to make our Help buttons less intimidating.



We also try to address issues in the questions that we write for the lab. Below is a screenshot of one of the questions for the Atom Builder lab. It asks students to classify each atom description as stable and/or neutral to confront the idea that stable and neutral are equivalent.



Note that we encourage students to use the lab to answer the question, as they are not expected to know which combinations of nucleons result in a stable nucleus. The items are also randomized to promote meaningful collaboration. In summary, user testing informs both the design of the lab and the questions that we ask about the lab.

Engage students in lecture

The open design of the interactive labs enables use in a variety of educational settings. Below are two examples of using the Conductivity lab with clicker questions in a lecture environment. The goal of this lab is for students to compare different types of solutions. They can drag the electrodes into a solution to measure the conductivity and view the particles in solution.



One type of clicker question that lends itself well to an interactive lab is a prediction question. For example, an instructor can ask students to predict which light bulb below shows the result when the electrodes are placed in one of the solutions. The instructor can collect student responses and then use the lab to show students the result.



Another type of clicker question is one that generates critical discussion of the interactive lab. For example, water molecules are not shown in the Conductivity lab. An instructor can ask students to select the zoom view below that shows the best representation of water. The first option shows a macroscopic representation, the second shows water molecules floating in water, and the third shows water molecules tightly packed. The instructor can collect student responses and then facilitate a classroom discussion about particulate models of solutions.

Engage students in lab

Our interactive labs are designed to support the inquiry process. For example, the goal of the lab below is for students to investigate precipitation reactions. Students can mix two solutions of ionic compounds and observe whether a precipitate forms. They can also use the results to identify an unknown solution. An instructor can use this lab to ask students to construct the solubility rules rather than to confirm the rules.



Each interactive lab comes with suggested questions. Many of the questions that we write for the labs ask students to collect and analyze data from the lab. An instructor can use this resource to prepare students for the experience of a wet lab. The eBooks for high school also include lab videos. Below is a still from a video about precipitation reactions. An instructor can use this resource to ask students to contrast the physical reaction with its virtual representation.



The interactive labs provide a safe environment for experimentation. Some of our labs contain reactions with safety or waste issues. Others allow procedures that would require equipment that a high school science department may not own. In this way, our interactive labs can enhance and extend the wet lab experience.

Engage students at home

Not surprisingly, an instructor can use our interactive labs with online homework. One strategy is to introduce a concept at home so that students are prepared to apply the concept in class. Many of our questions ask students to notice relationships in the lab. Other questions, like the example for the Atom Builder lab, prompt students to construct a working definition of science terms. Below is a screenshot of a question for the pH lab that asks students to examine how adding water or opening the drain affects the pH of each solution.



We include a link to the interactive lab in the question stem. Since the interactive labs are not randomized, we randomize the questions associated with the labs to encourage students to collaborate. In the example above, the solutions in the table are randomized so that each student is likely to get a different set. We also encourage students to use the lab to answer the question. An instructor can use the same strategy with other online resources and other homework systems.

Engage students anywhere

We develop the interactive labs in HTML5 to enable students to use the labs on any device. This means that we must ensure that our labs work on laptops and tablets, in multiple browsers and platforms. This also means that we must consider touch interfaces in the design of the labs.

The pH lab provides one example of a design consideration. We often use cursor changes to signify when an object is interactive. As shown in the image below, when a student hovers over the button on the water bottle, the cursor changes from an arrow to a hand. We occasionally show a tooltip, such as “add water”, when a student hovers over an interactive object. Neither of these cues is possible on touch interfaces, so we must rely on artistic effects.
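
One way to branch on the input type is a simple capability check, sketched below; the class names are hypothetical, and real deployments also need to handle hybrid devices that support both touch and mouse.

    // Illustrative capability check: hover cues (cursor changes,
    // tooltips) only help mouse users, so on touch devices we would
    // lean on always-visible artistic effects instead.
    var touchCapable = "ontouchstart" in window ||
                       (navigator.maxTouchPoints || 0) > 0;

    document.body.classList.add(touchCapable ? "touch-cues" : "hover-cues");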



The Density lab provides an example of another design consideration. The goal of this lab is for students to design an experiment to determine the density of a spherical object. They can use the balance to measure the mass, and they can use the ruler or water displacement to determine the volume. Students can compare two objects of the same material with different sizes or two objects of different materials with the same size. They can also identify a mystery object.


Click on the image to open a video of the lab on a tablet.
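
For reference, the arithmetic the lab asks students to assemble is density = mass / volume, with the volume of a sphere computed from a ruler measurement of the diameter. The numbers below are illustrative.

    // Worked example: density from a balance reading and a measured
    // diameter. V = (4/3) * pi * r^3 for a sphere.
    function sphereVolume(diameterCm) {
      var r = diameterCm / 2;
      return (4 / 3) * Math.PI * r * r * r; // cm^3
    }

    function density(massG, volumeCm3) {
      return massG / volumeCm3; // g/cm^3
    }

    // A 2.0 cm sphere with mass 33 g: ~7.9 g/cm^3 (iron-like).
    console.log(density(33, sphereVolume(2.0)).toFixed(1));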

We used a tablet for hallway testing and saw that it was difficult for users to measure the diameter of the small objects because their fingertips covered the objects. In the next iteration, we made the ruler transparent and added the ability to drag the ruler over the objects.




We flipped the interface of another lab horizontally after we saw that users’ hands covered the features below their fingers. Touch interfaces are important to consider early in the design process, and user testing on tablets often reveals usability issues.

For teachers: Customization

Sapling Learning pairs each instructor with a “Tech TA”, a subject expert who provides support throughout the semester. One way that our Tech TAs support instructors is by editing our existing content. We also plan to offer customization of the interactive labs. For example, we can add or remove chemicals from a lab to better align with a particular experiment.

Below is a version of the Density lab with objects removed. In this version, the objects are made of the same material. The material is one that has a range of density values. The largest rock does not fit inside the graduated cylinder, and it also maxes out the balance. Students can use the ruler and the average density of the other rocks to determine the mass of the largest rock. Another version of this lab could include rocks with other shapes.
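
A short worked version of that reasoning, with made-up numbers, is sketched below.

    // The largest rock maxes out both the balance and the cylinder,
    // so its mass comes from the average density of the measurable
    // rocks times a ruler-based volume estimate. Numbers are made up.
    var measured = [
      { massG: 25.1, volumeCm3: 9.5 },
      { massG: 41.8, volumeCm3: 16.0 }
    ];

    var avgDensity = measured.reduce(function (sum, rock) {
      return sum + rock.massG / rock.volumeCm3;
    }, 0) / measured.length; // ~2.6 g/cm^3

    var largeRockVolume = 120; // cm^3, estimated with the ruler
    console.log((avgDensity * largeRockVolume).toFixed(0) + " g"); // ~315 g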



Below is a version of the Density lab with objects added. This version includes an object that floats in water and an object with an irregular shape. Students can use the ruler to determine the volume of the wood ball, and they can use water displacement to determine the volume of the gold nugget.



Another way that our Tech TAs support instructors is by developing new content to address specific learning goals. We plan to provide the same level of support for our interactive labs.

For researchers: Data

Over 200,000 students are using Sapling Learning this semester. This means that many of our homework questions are attempted by thousands of students. Our homework system tracks every incorrect response. We already use this data to gauge difficulty and to ensure that students are getting our specific feedback. Many instructors use this data to shape their lectures.

An education researcher can use this data to uncover common student ideas or to compare student ideas before and after a learning experience. Below is a screenshot of a question that an instructor may assign during a unit on density. The question asks students to use the intensive nature of density to predict the behavior of a small metal block.



More than 1000 students have attempted this question. With the targeted feedback, nearly all students correctly place the metal at the bottom of the beaker. The student paths are even more interesting. The image below shows the five patterns followed by 98% of the students. We see that only 84% of the students were correct on their first attempt. We also see that students were more likely to place the metal in the middle of the beaker than to place it on the surface of the water.



Here we return to the opening question: Did students label the 2D model of CF4 as square planar? Of the roughly 1000 students who attempted this question, only 6% selected this on their first attempt. On the other hand, 19% of students labeled SF4 as tetrahedral on their first attempt. Given the specific feedback that the central atom has one lone pair, most students went on to successfully complete the question.

Next steps

We are currently working on math labs for a new high school product in addition to developing new interactive labs for science. We are also exploring ways to offer free access to the interactive labs for students and teachers. One plan is to form a community where we can share labs in progress and get feedback on the design. We hope that you are able to apply one of the ideas in this article in your own work, and we look forward to your comments.

References

(1) This article is based on a presentation given by the author:
Lancaster, K. Engaging students with Sapling interactives. Presented at the 246th ACS National Meeting & Exposition, Indianapolis, IN, September 8-12, 2013; Paper CHED 404.

(2) For an independent case study using Sapling Learning, see:
Parker, L.L.; Loudon, G.M. Case Study Using Online Homework in Undergraduate Organic Chemistry: Results and Student Attitudes. J. Chem. Educ. 2013, 90, 37-44.
http://dx.doi.org/10.1021/ed300270t

(3) For more on challenges unique to the design of interactive chemistry simulations, see:
Lancaster, K.; Moore, E.B.; Parson, R.; Perkins, K.K. Insights from Using PhET’s Design Principles for Interactive Chemistry Simulations. In Pedagogic Roles of Animations and Simulations in Chemistry Courses; Suits, J.P., Sanger, M.J., Eds.; ACS Symposium Series, Vol. 1142; American Chemical Society: Washington, DC, 2013; pp 97-126.
http://dx.doi.org/10.1021/bk-2013-1142.ch005

(4) Texas Administrative Code, Title 19, Part II; Chapter 112. Texas Essential Knowledge and Skills for Science; Subchapter C. High School.
http://ritter.tea.state.tx.us/rules/tac/chapter112/ch112c.html

(5) Sanger, M.J.; Greenbowe, T.J. Students’ Misconceptions in Electrochemistry: Current Flow in Electrolyte Solutions and the Salt Bridge. J. Chem. Educ. 1997, 74, 819-823.
http://dx.doi.org/10.1021/ed074p819

(6) For more on the application of implicit scaffolding in simulation design, see:
Podolefsky, N.S.; Moore, E.B.; Perkins, K.K. Implicit scaffolding in interactive simulations: Design strategies to support multiple educational goals. Submitted to J. Sci. Educ. Technol.
http://arxiv.org/abs/1306.6544

(7) Van Noorden, R. Bohr’s model: Extreme atoms. Nature 2013, 498, 22-25.
http://www.nature.com/news/bohr-s-model-extreme-atoms-1.13118

Acknowledgements

The author would like to acknowledge the content and art teams at Sapling Learning. In particular, she would like to thank Jeff Sims, our interactive developer and animator. The author finds it a funny coincidence that she used to design “sims” for the PhET project and that Jeff lives in Lancaster, PA.

Comments

Comment for ConfChem Article: Engaging Students with Sapling Learning's Interactive Labs

Hi Kelly,

Your presentation is interesting. How does your system help students learn how to go from gB -> molA -> molB -> gA? How does learning with your system compare with the average?

Thanks,

Brian

Data Collection

This is really nice stuff, and I can identify with the issues (both pedagogical and technical) associated with authoring the content. 

You are obviously collecting data associated with the student responses to questions. Do you have a system that allows your teacher users easy access to this data (% right answers and wrong answers chosen)?  If so, do you have any evidence as to the extent to which your teacher adopters are using this facility?

Also, do you have any way of collecting data on how the students en masse use the simulations? Does the on-line testing that you describe in the iterative process mean that you are actually analysing how the simulation is used, OR are you surmising how it was used from the answers to the questions that are asked?

There is sooo much potential for the data from this type of exercise to give us clues about where students are missing the point (and even better, analysing the data from questions often makes it obvious that the deficiency was only partly with the student).   Do you have a program of regularly reviewing student responses in order to decide whether the question is being asked in an appropriate way?


Data Collection

Hi Sheila,

Thank you for the nice comments and for your interest in the data.

1. We do have a system that allows instructors to access statistics for each activity. Our gradebook shows the percent of students that attempt a question along with the percent that answer the question correctly. It also shows the number of attempts and average score for each question. We have survey data suggesting that about half of our instructors use these statistics to identify problem areas, while the other half either do not use this feature or are not aware of it.

2. For the interactive labs, we only have access to the answers that students submit for the lab questions. The online testing service does provide think aloud videos that allow us to examine how users interact with the lab.

3. We do analyze this data to identify problem questions. Our goal is for every student to succeed on their homework. We collect student data for each question, and we revise questions that appear to be outliers. One metric that we use to gauge difficulty is the average number of attempts. Another metric that we use is the number of times the default (or general) feedback is triggered. We look for a number less than or equal to one to ensure that students are getting targeted feedback.
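
(As a rough sketch with made-up numbers, the two metrics might be computed from a per-student response log like this:)

    // Hypothetical per-student log for one question.
    var responses = [
      { attempts: 1, defaultFeedbackCount: 0 },
      { attempts: 3, defaultFeedbackCount: 2 },
      { attempts: 2, defaultFeedbackCount: 1 }
    ];

    function average(values) {
      return values.reduce(function (a, b) { return a + b; }, 0) /
             values.length;
    }

    var avgAttempts = average(responses.map(function (r) { return r.attempts; }));
    var avgDefaultHits = average(responses.map(function (r) { return r.defaultFeedbackCount; }));

    // Revise the question if students average more than one
    // fall-through to the default (general) feedback.
    console.log(avgAttempts, avgDefaultHits, avgDefaultHits <= 1);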

Thanks again,
Kelly 

Logistics

Hi Kelly,

Thanks for sharing the article; the screenshots make the simulation seem very user-friendly. As for the lab, I do have a couple of logistical questions:

1. How do you and Sapling Learning make the case that this simulation is different from all the rest (e.g., ChemCollective, ChemLab, VirtLab)? Is it the interactive feature with instant feedback that makes "yours" better than "theirs"?

2. I work primarily with secondary education, and I'm curious what a typical cost would be for an instructor or school to have access to the simulations that are able to collect data.

3. Does each lab simulation have the capability for instructors to write questions?

Cheers,

Chris

Logistics

Hi Chris,

Thank you for the nice comment about the interactive labs.

1. This is probably a question for the marketing team. My goal is to make a tool that gets students excited about science. There are many other resources available that can achieve the same goal. One possible added value is our integrated assessment of the labs. The company also values educator support.

2. For high school users, the labs are part of a larger eBook product. A school pays for access, and the price is based on the number of students. Note that the high school product is only actively sold in Texas, since it is aligned to the state standards. We are also considering ways to offer free access to the interactive labs.

3. Instructors can write their own questions in our homework system. We have a help page about authoring on our website, and we also offer training for those who are interested in writing more advanced questions.

Thanks again,
Kelly

Adaptive learning capability

Hi Kelly,

It's interesting to learn how Sapling is developing adaptive learning capabilities with its tools. I wonder if you could address in greater detail how this adaptive process is informed? I gathered from your article that you are selecting a sample based on convenience and having them do think alouds. Is this correct?   In which case, I wonder how these people, your coworkers, compare with "traditional chemistry students"? Also, there is a study by Yeo, Loss, Zadnik, Harrison and Treagust (2004) that tells us that interactive media is promoted "as an effective and stimulating medium" yet they note that students do not always interact with multimedia as intended by the designers. It's interesting that you are able to chart how students meander their way through the density lab, but it makes me wonder what these students were actually thinking while they were engaged in this process? Does Sapling have researchers like you examine this? I guess I would like to better understand how Sapling informs the design of their tools. Could you elaborate on how the content and process goals are used to create tools from start to finish? It seems like the learning goals for the examples you listed are quite different. Do you educate the viewer on the intention behind the tools? 

As you can see, I have many questions! Thanks for sharing, Kelly!

Adaptive learning

Hi Resa,

Thank you for your interest and for all of the great questions.

1. By adaptive learning, are you referring to the targeted feedback in our homework questions? The feedback is informed by our experience, as well as by the results of similar questions. One example comes from a question that asks students to read the buret volume. We included the feedback to "read the volume level from the top down" based on our lab teaching experience. We also used student data from a similar graduated cylinder question to inform the feedback about the number of digits.

2. The quote from the Yeo study gives the main reason for doing user experience testing. We often uncover issues during think alouds with coworkers, who are members of the sales, marketing, art, content, and development teams. Of course, it would be more informative to recruit testers from the pool of intended users (chemistry students). The online testing service does allow us to specify user demographics, such as age, and it also better simulates the homework environment.

3. It would be very interesting to capture this thought process, but we only have access to the answers that students submit. For the density question, it is possible that some students were randomly guessing, since there are a limited number of options. Instructors have better access to their students and to the context of their classroom, and some have used student data to answer research questions.

4. The content and process goals guide both the design of the lab and the assessment of the lab. The content goals are specific to each lab, while the process goals are generally the same for each lab: to engage students in science. We do include an intro screen with the learning objectives of the lab.

Thanks again,
Kelly

Interactive Simulation Design: TouchScreen vs. Mouse and Monitor

Dear Kelly,

Thank you for sharing this work with us. I’d like to know a bit more about your thoughts on how the design of interactive simulations varies between a touch screen and a traditional monitor and mouse. You mention that on touch screens you do not have as many cues available and need to rely more on artistic effects. Are you able to gather information on who is using these different interfaces? Is there a demarcation between user groups that would influence the design of the simulations; for example, are high school students more likely to use touchscreens than college students?

Bob


Touch vs. Mouse

Hi Bob,

Thank you for the opportunity to share this work with this community.

We are finding that cues are less important when the labs are used on tablets. Interacting with objects directly on the screen (rather than indirectly with a mouse) seems to be very intuitive to users. In fact, some users have been able to “break” labs during the testing phase by attempting to interact with more than one object at a time, which is not possible with a single mouse pointer. It would be interesting to build multi-touch into the design of an interactive lab.
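
(For illustration, here is a bare-bones sketch of tracking simultaneous touch points; the data-draggable convention is hypothetical:)

    // Track each active touch by its identifier so two fingers can
    // drag two different objects at once.
    var activeDrags = {}; // touch identifier -> element id

    document.addEventListener("touchstart", function (event) {
      for (var i = 0; i < event.changedTouches.length; i++) {
        var t = event.changedTouches[i];
        var el = document.elementFromPoint(t.clientX, t.clientY);
        if (el && el.getAttribute("data-draggable")) {
          activeDrags[t.identifier] = el.id;
        }
      }
    });

    document.addEventListener("touchend", function (event) {
      for (var i = 0; i < event.changedTouches.length; i++) {
        delete activeDrags[event.changedTouches[i].identifier];
      }
    });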

We are able to track the browsers and operating systems used to access our high school and higher ed servers. This does not give very useful data regarding tablets vs. desktops, since our homework system still requires Flash. But it is interesting to note that, compared to higher ed users, more high school users access the site with older versions of Internet Explorer that do not support the HTML5 canvas element. We also have survey data that suggest that almost half of our higher ed users own tablets.
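
(The canvas check itself is a one-liner; this is the standard feature-detection idiom, not our production code:)

    // Detect HTML5 canvas support, the capability that the older IE
    // versions mentioned above lack.
    function supportsCanvas() {
      var el = document.createElement("canvas");
      return !!(el.getContext && el.getContext("2d"));
    }

    if (!supportsCanvas()) {
      console.log("This lab requires a browser with canvas support.");
    }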

Thanks again,
Kelly 

The mole concept and Avogadro's number

Hi  Kelly,

Thanks for your interesting article.

I am curious about how you have selected the different topics related to each simulation. Though the number of possible topics is huge, I wonder if you have considered any simulation oriented to illustrating the mole concept and Avogadro's number. Every time I ask schoolteachers about difficult concepts for their students, this is one of the most frequently mentioned.

Thanks

Pascual 


Forwarded Comment from Barbara Petelenz

Dear All,

As a 65-year-old person from a distant European country I would like to comment on 'difficult concepts' in chemistry.

When I was a teenager the concept of mole was probably the easiest and most 'user-friendly' not only for myself but for the majority of my classmates. This opinion has been shared by practically all professional chemists of more or less my age.

Surprisingly, in the generation of my daughter, who started to learn chemistry in the mid-1990s, the majority of her classmates found the mole concept difficult. My teenage nephews and their classmates share this sad opinion as well.

Conclusion?

Evidently, the way of introducing the concept of the mole deteriorated sometime between 1962 and 1990. In my opinion, the culprit is the excessive striving for accuracy imposed by the ever more refined IUPAC definitions. For example, today's children cannot guess why they have to deal with 1/12 of the atomic mass of C-12, because they need to know the concept of isotopes, mass defect, etc.

But when I told my nephew that he just has to match a 50 kg bag of nuts with a 48 kg bag of bolts, he suddenly understood.

Best regards,
Barbara


Forwarded Comment from Jim Carr

Regarding the mole concept: I still have my high school and freshman university chemistry books. Both introduce the mole through the concept of "gram molecular weight". Both books get to Avogadro's number after introducing gram molecular (atomic) weight. That worked for me, but I don't find that phrase in modern freshman books.
Jim Carr

Why students find Chem difficult

To see why students have trouble with the mole concept (or balancing equations), try this test:

Ask several students individually to answer 8 x 6 and 7 x 8 without a calculator. See how long it takes them. See what percentage get them right.

Or -- give them some simple chemical equations to balance -- without a calculator.

Students in most US states since 1990 have been taught under state K-12 math standards that required the use of calculators to do arithmetic starting in THIRD grade.  As one result, they don’t have much sense of numeracy.  Chemistry is based on simple whole number ratios, but many in this generation were not taught that they needed to know those ratios, so for them, chemistry is not simple.

The best predictor of success in second-semester Gen Chem has been found to be simple math without a calculator. A good paper on this is by Leopold and Edgar: J. Chem. Educ. 2008, 85, 724.


Mole concept

Hi Pascual,

Thank you for your interest in the article.

Most of the topics for the interactive labs were selected to address the Texas high school standards. Moving forward, we plan to focus on topics that are requested by instructors. We will certainly add the concept of moles to this list.

Thanks again,
Kelly 

Development environment

Very impressive work! 

Are your simulations delivered by stand-alone software installed on individual computers or by a walled Web-based system? You mentioned you use HTML5. Can you tell us something about the programming environment used by your programmers? I've always been interested in simulations and had developed and used a number of them for older students back in the 1980s and '90s. But a problem that I've always had is that development environments and the code they produce eventually break, and it becomes impossible to port to newer hardware without a complete rewrite.

Development environment

Hi Tom,

Thank you for the nice comment on this work.

We decided that our developers could best address your question about the programming environment. Below are their answers.

Jeff Sims:
"The interactives are essentially web apps, written in HTML5 with JavaScript, CSS, and jQuery among other JavaScript libraries. They can run in the browser as web apps, or as HTML5 widgets for example in Apple's iBooks.

As far as programming environment, I use Sublime Text which is really just a fancy text editor, and use the Adobe Creative Suite apps for design and art elements. Though a few examples in the article were done in Flash (the question examples)."

Verna Hartinger:
"I use a slightly different environment in that I use an "IDE" (Integrated Development Environment) called Webstorm (produced by a company who makes a popular IDE for Java developers). It is like a fancy editor with some helpful features for JavaScript and CSS coding including some built-in error catching and code efficiency hinting. I use Flash with its Professional Toolkit for CreateJS sometimes as a way of outputting display objects for the CreateJS/EaselJS framework, (the latter framework being a tool to facilitate addressing objects being drawn to the HTML5 canvas to allow interactivity and animation). I also use jQuery and sometimes TweenLite (for animation sequences) - the latter being a tool that has been used by Flash developers for years and having just been made available for JavaScript - the point being that is has been around for a while. The thing about HTML5 development is that it could theoretically be done with a plain text editor (although that would not be very efficient) so you don't really have to be too concerned about the tools becoming obsolete. The issue then becomes the fact that the environment you are targeting (target browsers) is changing every five minutes, as are user interface standards."

Thanks again,
Kelly