Tuesday, November 27, 2007

Collecting and Analyzing Data

Lee/Owens say there are five activities required for data collection and analysis: 1) Set up the database, 2) Develop an evaluation plan, 3) Collect and compile the data, 4) Interpret the data, and 5) Document your findings.
In step one, you need to decide what software and/or hardware you will need for your database and set it up. In step two, you need to plan out how you want the evaluation to take place, figure out how many people need to be involved, how to keep your subjects' information confidential, and how much the evaluation will cost to implement. In step three, you start the evaluation and collect and compile the information as subjects finish. In step four, you interpret the data to see whether or not the goals of the evaluation have been met. Then, determine whether the evaluation was a success and should be used again, or whether it failed and needs to be changed before further implementations are made. In step five, as always, document everything you do and the reasons why you did it.
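Steps three through five above could be sketched in a few lines of Python. This is only a minimal illustration, assuming evaluation responses are survey scores on a 1-to-5 scale and a hypothetical success goal of a 4.0 mean; neither the data format nor the threshold comes from Lee/Owens.

```python
# Sketch of steps 3-5: compile the data, interpret it against the
# evaluation goal, and document the decision. The 1-5 score scale and
# the 4.0 goal are illustrative assumptions, not from Lee/Owens.
from statistics import mean

def interpret_results(scores, goal=4.0):
    """Compile scores, check them against the evaluation goal,
    and return a short record documenting the decision."""
    avg = mean(scores)
    met = avg >= goal
    return {
        "mean_score": round(avg, 2),
        "goal": goal,
        "goal_met": met,
        "decision": "reuse evaluation as-is" if met else "revise before next run",
    }

result = interpret_results([4, 5, 3, 4, 5])
print(result["mean_score"], result["decision"])  # 4.2 reuse evaluation as-is
```

The returned record is the "document your findings" part of step five: the decision and its basis are kept together rather than just the raw numbers.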

Lee, W. W., & Owens, D. L. (2004). Multimedia-based instructional design: Computer-based training, web-based training, distance broadcast training, performance-based solutions. San Francisco, CA: Pfeiffer.

Measures of Validity

Lee, Roadman, and Mamone (1990) recommend a process for establishing validity that has three activities: 1) Determine the level and type of validity required, 2) Determine when to validate measurement instruments, and 3) Document your decisions.
The levels of validity are low, medium, and high. The types of validity include face, content, concurrent, construct, test item, predictive, and inter-rater agreement. When determining when to validate measurement instruments, you must decide which phase(s) of your design process will give you the results you are looking for, or will best show you what changes need to be made. As always, remember to document when and why you did what you did.
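Activity three, documenting your decisions, could be as simple as keeping a structured record for each validation choice. A hypothetical sketch, using the levels and types listed above; the record fields and example values are my own, not from the authors.

```python
# Hypothetical documentation record for validity decisions: which type
# was checked, at what level, during which design phase, and why.
# Field names and the example entry are illustrative assumptions.
VALIDITY_TYPES = {"face", "content", "concurrent", "construct",
                  "test item", "predictive", "inter-rater agreement"}
VALIDITY_LEVELS = {"low", "medium", "high"}

def record_validity_decision(vtype, level, phase, rationale):
    """Validate the inputs against the known types/levels and
    return a documentation entry."""
    if vtype not in VALIDITY_TYPES:
        raise ValueError(f"unknown validity type: {vtype}")
    if level not in VALIDITY_LEVELS:
        raise ValueError(f"unknown validity level: {level}")
    return {"type": vtype, "level": level,
            "phase": phase, "rationale": rationale}

entry = record_validity_decision(
    "content", "high", "design review",
    "instrument items must map directly onto course objectives")
print(entry["type"], entry["level"])  # content high
```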

Lee, W. W., & Owens, D. L. (2004). Multimedia-based instructional design: Computer-based training, web-based training, distance broadcast training, performance-based solutions. San Francisco, CA: Pfeiffer.

Monday, November 26, 2007

Evaluation Plan

Lee/Owens say that there are five steps for creating an evaluation plan: 1) Complete the problem statement section, 2) Complete the solution section, 3) Complete the objectives section, 4) Complete the components of the evaluation plan for each level at which you will evaluate the project, and 5) Complete the executive summary.
In step one, you want to define the purpose of the plan and why the problem is important to fix or discuss. In step two, you want to offer some ideas on how to fix the problem. In step three, you want to discuss what you want the final outcomes to be and how your suggested ideas will get you to that end goal. In step four, you need to complete the following levels, as appropriate for your plan.
Lee/Owens list these levels and actions for each level:
Level 1: Reaction
-Survey construction -Data collection -Data analysis -Expected results -Reporting results
Level 2: Knowledge
-Test construction -Data collection -Data analysis -Expected results -Reporting results
Level 3: Performance
-Observation study construction -Data collection -Data analysis -Expected results -Reporting results
Level 4: Impact
-Study construction -Data collection -Data analysis -Expected results -Reporting results
In step five, you want to restate your problem and briefly review the purpose of the plan. Then you want to summarize and discuss all of your findings from the plan.
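The level/action matrix above is regular enough to sketch as a small data structure. Only the construction step differs between levels, so a sketch might factor out the shared actions; this framing is my own, not Lee/Owens'.

```python
# Sketch of the four evaluation levels and the actions listed for each.
# The data comes from the Lee/Owens list above; the factoring of shared
# actions into one list is an illustrative choice.
SHARED_ACTIONS = ["data collection", "data analysis",
                  "expected results", "reporting results"]

EVALUATION_LEVELS = {
    1: ("Reaction",    "survey construction"),
    2: ("Knowledge",   "test construction"),
    3: ("Performance", "observation study construction"),
    4: ("Impact",      "study construction"),
}

def actions_for(level):
    """Return (level name, full ordered action list) for a level 1-4."""
    name, construction = EVALUATION_LEVELS[level]
    return name, [construction] + SHARED_ACTIONS

name, actions = actions_for(3)
print(name, actions[0])  # Performance observation study construction
```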

Lee, W. W., & Owens, D. L. (2004). Multimedia-based instructional design: Computer-based training, web-based training, distance broadcast training, performance-based solutions. San Francisco, CA: Pfeiffer.

Evaluation Strategy

Lee/Owens say that there are three activities to complete to form an evaluation strategy: 1) Write an introduction, 2) Determine the requirements to evaluate the results, and 3) Determine what your source of information will be from which you draw your strategy.
The introduction of your evaluation should discuss what the evaluation is for and give some background information about what happened. When determining your requirements, you want to figure out which parts of the event you are evaluating will be measured and how to measure the results. You will also need to figure out how you want to collect and analyze the data. Finally, in the third activity, you should cite the sources you used during the event you are evaluating and any resources you used to create the evaluation process itself.

Lee, W. W., & Owens, D. L. (2004). Multimedia-based instructional design: Computer-based training, web-based training, distance broadcast training, performance-based solutions. San Francisco, CA: Pfeiffer.

Monday, November 12, 2007

Purpose of Evaluation

Lee/Owens say that determining the purpose of evaluation involves three steps: 1) Determine whether the measurement variables are organizational or individual, 2) Identify the measurement variables that are individual rather than organizational, and 3) Determine whether the solution will be used commercially.
The purpose of evaluation on the organizational side is to determine whether the company is meeting the needs of the people who buy its products, and whether its money is being spent in the right ways so that it makes the most profit and stays in business.
The purpose of evaluation on the individual side is to determine whether a person is doing well at his or her job, how he or she compares to others in the company, and where the boss and that person see him or her in the future (i.e., higher or lower job positions).

Lee, W. W., & Owens, D. L. (2004). Multimedia-based instructional design: Computer-based training, web-based training, distance broadcast training, performance-based solutions. San Francisco, CA: Pfeiffer.

Introduction to Multimedia Evaluation

According to Lee/Owens, there are four levels of evaluation: Level 1, Reaction; Level 2, Knowledge; Level 3, Performance; and Level 4, Impact.
Reaction measures the trainees' response, based on how they feel the training will help them do their jobs better. Knowledge measures the learner's achievement of the knowledge and skills the course was designed to teach. Performance measures changes in the student, over a period of time, as a result of applying what was learned in the course on the job. Impact measures the business's return on investment based on the skills learned in the training.

Lee, W. W., & Owens, D. L. (2004). Multimedia-based instructional design: Computer-based training, web-based training, distance broadcast training, performance-based solutions. San Francisco, CA: Pfeiffer.

Monday, November 5, 2007

Asian professionals prefer training to big bucks

Asian professionals are starting to value training more than money. "With the global war on talent showing no sign of relenting, organisations are striving to balance employee value proposition with more dollars on training and less on base pay to attract, retain and engage employees," Srikanth said.
I wonder how many people sign up for a job just to get expensive training and then leave.


http://in.news.yahoo.com/071102/48/6mqoj.html