Title | McCloud, Carrie_MPC_2013 |
Alternative Title | Effects of Interior Design on Computer-based Testing at Weber State University |
Creator | McCloud, Carrie |
Collection Name | Master of Professional Communication |
Description | This is a study about how interior design factors affect student test outcomes in computer-based testing at Weber State University. Research (Brooks, 2011) demonstrates that the physical learning environment has a significant measurable impact on learning outcomes. Researchers and educators (Stone, 2008) are calling for more research in designing educational spaces. Designing positive experiences is important because emotions affect how the cognitive system operates and aesthetics can affect the emotional state (Norman, 2004). About 900 WSU students were surveyed about their perceptions of the interior design of the testing centers and its effect on their test performance. The study showed that the majority of students perceive the design of the testing centers to be good to excellent, but they noted that close proximity to others, noise, and temperature fluctuations were distracting during test taking. An observational interior design analysis of five testing centers at Weber State University concurred that the centers are well designed but could be improved in how space is used. An ANOVA on a randomized sample of 1,000 student test scores showed no statistically significant difference in student test outcomes by testing center. |
Subject | Communication and technology; Communication--Research; Interior decoration; Educational tests and measurements |
Keywords | interior design; testing center; computer-based testing; human factors; environmental design; educational assessment; human-computer interaction; higher education |
Digital Publisher | Stewart Library, Weber State University |
Date | 2013 |
Language | eng |
Rights | The author has granted Weber State University Archives a limited, non-exclusive, royalty-free license to reproduce their theses, in whole or in part, in electronic or paper form and to make it available to the general public at no charge. The author retains all other rights. |
Source | University Archives Electronic Records; Master of Professional Communication. Stewart Library, Weber State University |
OCR Text | Show 1407 University Circle, Ogden, UT 84408-1407 2013 Effects of Interior Design on Computer-based Testing at Weber State University Master of Professional Communication Thesis Carrie McCloud Weber State University, Department of Communication Author’s Note Carrie McCloud conducted this research as a candidate in the Master of Professional Communication program at Weber State University. This study was prepared to satisfy the thesis requirement for the degree. Sheree Josephson, Ph.D., chair of the Department of Communication, and Luke Fernandez, Ph.D., manager of Online Development, served as thesis advisors under the supervision of Kathy Edwards, Ph.D., director of the Master of Professional Communication program. Questions or comments regarding this paper may be directed to Carrie McCloud, c/o Department of Communication, Weber State University, 1407 University Circle, Ogden, Utah 84408-1407. EFFECTS OF INTERIOR DESIGN ON COMPUTER-BASED TESTING AT WSU APRIL/2013 Carrie McCloud – Master of Professional Communication Thesis, Weber State University 2 EFFECTS OF INTERIOR DESIGN ON COMPUTER-BASED TESTING AT WSU APRIL/2013 Carrie McCloud – Master of Professional Communication Thesis, Weber State University 3 ACKNOWLEDGEMENTS Several people assisted me in completing this project and I am grateful for the generous sharing of knowledge, expertise, time, and support that each one gave to me. To my advisors, Sheree Josephson and Luke Fernandez, thank you for the many hours you spent reading and rereading my drafts, editing my prose, guiding the form and process of the project, and for your patience with my many questions; I am in awe of both of you. To Chi Tester programmer, Adam Trost, thank you for all the time you spent modifying the code and troubleshooting the bugs created by accommodating my little survey modification. To my consultants, Jeff Willden and Heather Chapman, thank you for your insight, for enlightening discussions, and for your assistance in pulling data and running the stats. To WSU Fire Marshal, Dennis Montgomery, thank you for giving up several hours of your holiday to accompany me on my rounds. To my good friends, Donna Hernandez and Cory Cunningham, thank you for more than I can possibly note on a single page. Thank you for hours spent at the coffee shop, for fun and lively conversations, for proofreading, for happy little instant messages when I was discouraged. Your friendship, support, laughter, and willingness to listen to me think out loud are valuable to me beyond measure. You rock! I’m glad we’re friends. Carrie McCloud April, 2013 EFFECTS OF INTERIOR DESIGN ON COMPUTER-BASED TESTING AT WSU APRIL/2013 Carrie McCloud – Master of Professional Communication Thesis, Weber State University 4 ABSTRACT This is a study about how interior design factors affect student test outcomes in computer-based testing at Weber State University. Research (Brooks, 2011) demonstrates that the physical learning environment has a significant measurable impact on learning outcomes. Researchers and educators (Stone, 2008) are calling for more research in designing educational spaces. Designing positive experiences is important because emotions affect how the cognitive system operates and aesthetics can affect the emotional state (Norman, 2004). About 900 WSU students were surveyed about their perceptions of the interior design of the testing centers and its effect on their test performance. 
The study showed that the majority of students perceive the design of the testing centers to be good to excellent, but they noted that close proximity to others, noise, and temperature fluctuations were distracting during test taking. An observational interior design analysis of five testing centers at Weber State University concurred that the centers are well designed but could be improved in how space is used. An ANOVA on a randomized sample of 1,000 student test scores showed no statistically significant difference in student test outcomes by testing center.

Keywords: interior design, testing center, computer-based testing, human factors, environmental design, educational assessment, human-computer interaction, higher education

Effects of Interior Design on Computer-based Testing at Weber State University

Higher education institutions continue to transition from paper-and-pencil tests to computer-based testing. Because of the costs associated with computerized testing, colleges and universities institute the testing-center model to centralize academic assessment (Thurlow, Lazarus, Albus, & Hodgson, 2010). Shannon's (1948) model of communication theory suggests that in any communication there is a sender, a mode of transmission, and a receiver. Shannon (1948) also suggests that noise of many types can interfere with message transmission. In the case of academic assessment, the physical environment of the computer-based testing center has the potential to be noise in the channel, distracting the student and therefore interfering with the communication between instructor and student. The question is, to what degree does the physical environment distract students or affect learning outcomes in university testing centers?

Research (Coyne & Bartram, 2006) addresses the equivalency of computer-based testing and paper-based testing. Research (Eley, 2006) also addresses the effects of environmental design factors on student learning in classrooms. However, there is a notable lack of research on environmental factors as they apply to human interaction with computers in a testing environment. The literature that surrounds traditional educational spaces can be applied to testing centers in a general way, but the dynamic interaction that takes place between student and instructor within the constraints of a testing room, mediated by a computer, is inherently different from an interaction that takes place in the classroom. The amount of time a student spends in the testing center is, in most cases, considerably less than the amount of time spent in a classroom. The amount of anxiety a student brings into the testing center is, in most cases, considerably more than the anxiety the student brings to the classroom. This makes the testing center a unique learning environment. Past studies can certainly inform interior design decisions, but without research targeted specifically to the case of academic testing centers, those who create testing centers – interior designers, educators, testing personnel, and university purchasing departments – are shooting in the dark in their efforts to construct effective testing centers.
This study about how interior design factors affect student test outcomes in computer-based testing was conducted at Weber State University in Ogden, Utah. The purpose of this introductory study was to create a baseline of information about the current interior design of testing centers to inform future studies. This investigation was primarily informational in nature, examining an existing and vibrant computer-based testing organization that has been active at this institution for over a decade.

REVIEW OF THE LITERATURE

Computer-based Testing

Transition from paper-and-pencil testing. As educators adopted new technology and switched from paper-and-pencil tests to computer-based testing, many studies were conducted to assess the effects of this new technology. Numerous researchers (Coyne & Bartram, 2006; Hwang, Tseng, & Hwang, 2008; Caudle, Bigness, Daniels, Gillmor-Kahn, & Knestrick, 2011; Anakwe, 2008) concluded that computer-based testing is equivalent to paper-and-pencil testing. There are exceptions to this finding of equivalency. For example, in one study (Clariana & Wallace, 2002), students performed better on computer-based tests. The students were given identical tests – either paper- or computer-based versions – and the computer-based group outperformed the paper-based group. Computer familiarity was not related to the difference. However, in a study (Grignon et al., 2009) of cognitive testing using a paper version and a computer version of the test, patients with schizophrenia were disproportionately impaired on the computer test. Some researchers (Caudle, Bigness, Daniels, Gillmor-Kahn, & Knestrick, 2011) suggest that computer-based testing in higher education will better prepare students for high-stakes exams, such as certification qualifications that are often administered on computer. A recent study (Dosch, 2012) of nurse anesthetists found that familiarity with computer-based testing translated into higher scores on the national certification exam.

Concerns about Computer-based Testing. Though its use has become widespread, educators have expressed concern about student interaction with computer-based testing and note the lack of research on the subject. They worry that the graphical interface of computer-based systems may be difficult to understand or that computer screens may add glare to text. In one study (Caudle, Bigness, Daniels, Gillmor-Kahn, & Knestrick, 2011), students expressed annoyance at an onscreen clock in the corner of the testing screen. Strategies to overcome this irritation included sticky notes and stickers placed on the computer screen to block out the clock. Despite these concerns, students still strongly preferred computer-based tests, and researchers found that the use of computer-based testing did not alter the reliability of the test. Computer-based tests may be considered equivalent as a whole to paper-based testing, but individual students may perform better in one mode or another. This is referred to as the test mode effect (Clariana & Wallace, 2002). Researchers (Dosch, 2012) have looked into many possible factors to account for this, among them demographic profiles, test anxiety, computer familiarity, and interface items such as legibility, font, scrolling, and line length.
A 2003 study (Bridgeman, Lennon, & Jackenthal, 2003) reported on the effect of various screen sizes and screen resolutions on verbal and math scores for college-bound high school juniors. Ninety percent of the students stated they were comfortable using computers. Students scored higher on the verbal test by a quarter of a standard deviation with the use of a larger, high-resolution display. The majority of participants identified the need to continually scroll while reading on the lower-resolution screen as something that interfered with their test taking.

Designing for Human-computer Interaction

Human-Computer Interaction (HCI) is the study of how people use computers and the practice of planning and designing interactions with computers. HCI is the point where computer science, the behavioral sciences, and design intersect with other fields, such as psychology, cognitive science, ergonomics, and information science (Carroll, 2013).

Interface design. Initially, the field of HCI focused on optimizing the interface between people and machines – a mainly scientific and engineering endeavor. But HCI is now an academic field, with nearly all technology companies implementing user-centered HCI design processes (Sellen, Rogers, Harper, & Rodden, 2009). HCI design as a profession has grown over the last 20 years. HCI design practitioners apply an understanding of the social context where these interactions take place. They approach software development as a process for creating spaces for human communication (Faiola & Matei, 2010), taking into account users' personality, motivation, emotions, and moods (Tan, 2011).

User experience. User experience (UX) is the area of HCI that focuses on the overall interaction or experience a user has with a system. UX takes into account not just the software interface, but also the context in which the user will interact with the system. This context includes the physical space and the emotional state of the user. This emotional state can be important in a testing environment. Recent findings (Katz, 2010) indicate that decision-making is not just a cognitive process but also an affective one. Negative affect can make it harder to do otherwise easy tasks, while positive affect can make difficult things easier. There is also a link between affect and learning. When students are confronted with information that does not fit into their current knowledge base, those with a positive affect are better able to assimilate the information (Sottilare & Proctor, 2012). To assess a computer interaction system, UX designers typically follow a four-stage process (Sellen, Rogers, Harper, & Rodden, 2009) in which they study, design, build, and then evaluate the system with their users. They focus on the what, why, and how of improving user experience. The goal is to reach the point where the target user group judges the system as valuable and enjoyable. Factors that users report as issues when dealing with technology are commonly human interaction items, such as the annoying onscreen clock that users put sticky notes over to hide or the frustration students felt in needing to scroll excessively in order to read the text.
But there is a lack of information on designing human interaction environments that integrate personal human factors as well as social and organizational phenomena (Faiola & Matei, 2010). Because of this, researchers (Sellen, Rogers, Harper, & Rodden, 2009) have added a fifth stage, which they call understand, to the beginning of the process. Understand is aimed at identifying the human values a technology is being designed to serve. Identifying these values requires communication with all the stakeholders – the users and the software engineers – and collaboration across disciplines such as psychology, communication, art, cultural studies, architecture, and interior design.

Interior Design

The task of arranging the physical interior environment of a space, such as a testing center where computer-based testing will take place, typically falls to an interior designer. Interior designers who are educated at CIDA (Council for Interior Design Accreditation, 2011) accredited colleges and universities are expected to interpret research and apply theories of human behavior to solve complex design problems. The design process as described by the NCIDQ (National Council for Interior Design Qualification, 2004) begins with research on the client's goals and the needs of the occupants of the space. This is followed by a space plan and concept studies integrating this research with the principles of design and theories of human behavior, as well as complying with appropriate building, fire, and safety codes.

Interior Design Research in Educational Spaces. As there is limited formal research on occupant needs for a computer-based testing center, designers of these spaces must rely on research gathered about similar environments. Most of the research (Stone, 2008) with regard to physical educational environments has been on classrooms, more specifically on the use of multimedia in classrooms. One study (Brooks, 2011) sought to find a relationship between formal learning spaces and learning outcomes. The researcher found that technologically enhanced learning spaces have a significant positive impact, independent of other factors. Brooks states that his results "demonstrate clearly that the formal physical environment in which students take their courses has a significant impact on measureable student learning outcomes" (p. 719). Researchers (Stone, 2008) are calling for more research in designing educational spaces. Given the potential detrimental effects of improper lighting, noise, temperature, and physical discomfort (e.g., poor seating), Stone suggests that human factors specialists could contribute more to this area. Some studies have been conducted on human factors in education, such as seating and daylighting.

Seating. When chairs are uncomfortable or not properly sized for the occupant, this can be a distraction for students. Stone (2008) found that when students sat in chairs that were significantly too small they experienced spinal pain, but when students had
Dunn (1991) commented that school furniture is actually working against students and instructors: In fact, almost all school furniture seems designed to prevent students from concentrating and learning. For example, do teachers know that, when a person is seated on a hard chair, approximately 75 percent of that person's total body weight is resting on four square inches of bone? The resulting stress on the lower back often causes fatigue, discomfort, and the need for frequent movement (Dunn, 1991). Lighting. Numerous studies (Eley, 2006; Figueiro, et al., 2011) have confirmed a link between daylighting and educational outcomes. Eley (2006) linked quality daylighting to increased test scores in a sample of 26,000 students in California. Eley (2006) cited a Swedish study that showed that a lack of daylight negatively impacted students’ hormone patterns, which could influence students’ ability to concentrate. Wu & Ng (2003) showed children learned faster and performed better on standardized tests with good daylighting, and showed a 20% higher scholastic achievement in students with “good” daylighted schools over “poor” daylighted schools. Other studies (Figueiro, et al., 2011) are leading to an understanding of how light affects the regulation of circadian patterns. Evidence suggests that health and well-being are impacted by the circadian system and this in turn can impact productivity. There is consensus that a link exists between high performance and daylight, but defining that link has been a challenge. What constitutes good daylight or bad daylight? Do students need to be directly in the light or near it? The physiological and psychological benefits of daylighting have been shown in studies for different types of buildings including schools. However, IESNA (Illuminating Engineering Society of North America), the primary lighting authority in North America, only specifies daylight as a percentage per square foot. There is no consideration for number of room occupants EFFECTS OF INTERIOR DESIGN ON COMPUTER-BASED TESTING AT WSU APRIL/2013 Carrie McCloud – Master of Professional Communication Thesis, Weber State University 12 prescribed. Guidelines ignore the temporal quality of daylight, the way it changes and moves throughout the day. (Wang & Boubekri, 2011). An experiment (Wang & Boubekri, 2011) was performed to ascertain where daylight should be oriented when participants were seated in a room to work on a task. Several chairs were placed around the room. As part of the experiment, some students were allowed to choose where they sat; giving them a sense of control over how the daylight was oriented. Researchers found that the mood of subjects decreased when the subjects did not have room control or when subjects did not have a sense of privacy. Researchers concluded that the optimal zone for performance was one that “is located in close proximity to a sun patch and has a sense of privacy and control.” People were drawn to the sunlight and the view out the window, but control and privacy were equally important. The Principles of Design. Design theory (Nielson & Taylor, 2002) is formed by the principles of design. The principles are the cornerstone of what is considered good design and evaluation of an interior space begins with an assessment of these principles. The principles of design are (Nielson & Taylor, 2002): Emphasis – Sometimes called focal point, emphasis is a principle of design that indicates attention is given to a particular area in an interior space. 
A room or space will normally have one primary area of emphasis.
Rhythm – A principle of design seen as a visual pattern or recurrence of a design element that draws the eye along a path. Types of rhythm include alternation, repetition, gradation, and contrast.
Scale – A principle of design that evaluates the relative size and visual perception or weight of objects. Scale is usually noted in terms of small or light, medium, large or heavy, and extra large.
Proportion – A principle of design describing the desired relationship of parts to the whole in terms of size, ornamentation, or detail. An example of pleasing proportion is the golden section, used in art and architecture since the time of the ancient Egyptians and based on a ratio of approximately 2:3 (the golden ratio, about 1:1.618).
Harmony – A principle of design denoting the combination of parts into a pleasing whole. Harmony is the result of unity and variety balanced in such a way as to create an orderly and pleasing combination.
Balance – A principle of design describing the visual equilibrium of objects in a space, such as furniture, art, windows, and doors. Balance is usually described as symmetrical, asymmetrical, or radial.

The principles of design are created by the application of the elements of design – space, shape, form, mass, line, texture, pattern, light, and color – which are the tangible items used in an interior environment. For example, horizontal lines tend to create more relaxed and peaceful environments. Vertical lines communicate formality, while angular lines connote action and curved lines communicate fun and happiness (Nielson & Taylor, 2002). The elements of design are used to create the desired mix of these emotional cues to support the principles of design. The application of the elements transforms design theory into practice. An interior design can be judged by an examination of the use and placement of the elements with regard to the principles of design (Nielson & Taylor, 2002).

Theories of Human Behavior. Studies (Katz, 2010) show there are significant correlations between perceived aesthetics and the perception of usefulness. Aesthetics can influence user adoption and affect user satisfaction. Designing positive experiences is important because the emotional system affects the way the cognitive system works. Emotions affect the way the mind solves problems, and aesthetics can alter our emotional state (Norman, 2004). Therefore, if the design of a testing center affects the emotions of the user – a student taking a test – then the design can affect the student's ability to solve problems, and so the design can affect the test score.

One goal of interaction design is to give a voice to the end users with regard to the design and development of computer-related work. Communication theories that are commonly applied to computer interaction design include situated action, activity theory, and distributed cognition (Bodker & Sundblad, 2008). Knowing what users think is important to understanding how they interact with systems that are designed for them. A holistic, user-centered approach to HCI design incorporates interaction design theory, a responsive design approach, and the design stance (Bodker & Sundblad, 2008).
The design stance acknowledges users as sophisticated enough to understand that the designed systems they are using are, in fact, designed. Users take this knowledge with them and use it to interact with the system. For example, if a user cannot locate a feature on a computer application, they might think, “Where would they have put that …” Designers can use this reasoning to predict behavior and anticipate how users may act. This awareness of themselves as users and acknowledgment of the designer as an entity then creates a relationship between the two. The students in a testing center have a relationship with the proctors, the instructors, the computer programmers, and the university as a whole. People have beliefs about other people’s intentions. If the system, in this case the testing center, consistently operates the way it should, then this is a positive relationship. The users’ evaluation of the value and aesthetics of the system may be influenced by this perception (Crilly, 2011). A responsive design approach would embrace the education ideology, practice theory, which describes the interaction between learner and environment, and link this to the concept of responsive commissioning, a research approach that explores the nature of the interaction between the social and physical aspects of the learning environment. The designer can then create an environment that is more responsive to the needs of 21st century education (Lippman, 2010). Chi Tester At Weber State University, student-computer interaction takes place every day at a network of testing centers that administer computer-based tests. Chi Tester is a EFFECTS OF INTERIOR DESIGN ON COMPUTER-BASED TESTING AT WSU APRIL/2013 Carrie McCloud – Master of Professional Communication Thesis, Weber State University 15 proprietary software application designed and used at WSU for over 10 years. A Web-based testing platform for delivering secure academic tests, Chi Tester delivers approximately 200,000 tests a semester. About 100,000 of those are securely delivered through approximately 20 campus testing centers and through remote proctors to students all over the world. About 25,000 tests are delivered through an integrated Scantron system (Fernandez, 2012). Campus testing centers are spread throughout Weber State University’s campuses and centers. Seven sites constitute the main testing centers. The remaining testing sites are administered by individual colleges and departments and are primarily for specialized use. In order to administer a proctored exam, instructors set up the exam in Chi Tester, designate which testing centers they want to allow students to use for test taking, and set the dates of availability. Students then either make an appointment through Chi Tester to take the exam, or go to the testing center at will. The proctors use the Chi Tester check-in interface to confirm the student, verify the test, and to assign a computer station. The results of the exam are immediately available to the instructors and if review is allowed for feedback, the results are also made immediately available to the students while they are still in the testing center. As a starting point of investigation into how environmental design factors affect student outcomes in the testing environment, the following research questions were posed: RQ1: What are student perceptions of the interior design of testing centers at Weber State University? RQ2: How are the testing centers designed with regard to standard principles in interior design? 
RQ3: Is there a statistically significant difference in test scores among the testing centers? EFFECTS OF INTERIOR DESIGN ON COMPUTER-BASED TESTING AT WSU APRIL/2013 Carrie McCloud – Master of Professional Communication Thesis, Weber State University 16 METHOD This study was approved by the Weber State University Institutional Review Board. The study endeavored to be as invisible as possible to students, faculty, and staff. This approach was taken in order to 1) control bias as much as possible, and 2) avoid interfering with student achievement as much as possible. RQ1 – Student Perception In order to address the first research question, “What are student perceptions of the interior design of the testing centers?” a five-question survey was administered between March 10 and March 23, 2013, to students immediately upon completion of their exams in the testing centers. A customer satisfaction survey, which already existed in Chi Tester to inquire about technical quality and computer-related issues, was modified to accommodate the study. The modified survey appeared on the left side of the computer screen, the same way that the original survey was delivered, after students had finished their exam. Because the original survey was delivered to all students at all testing centers, the one modified for this study was also delivered to all students at all testing centers. EFFECTS OF INTERIOR DESIGN ON COMPUTER-BASED TESTING AT WSU APRIL/2013 Carrie McCloud – Master of Professional Communication Thesis, Weber State University 17 The survey was constructed with five questions to assess the participants’ perception of the testing environment. The first survey question was left the same as the original survey to ask if there were any computer or technical issues during the exam. If “yes” was selected, a text box appeared to allow students to comment on what difficulties they experienced. The second question asked participants to rate on a scale from one through five their impression of the interior design of the testing room, with one designated as “poorly designed” and five designated as “well designed.” The third question asked them to rate on a scale from one through five how much the design of the room impacted their test performance, with one designated as “no impact” and five designated as “high impact”. The fourth question was open-ended, asking them to describe aspects of room design that impacted their performance on the test. The fifth question remained the same as the original Chi Tester quality assurance survey and provided students with a space to write any additional comments they had about their testing experience. A screen shot of the modified survey is displayed above. RQ2 – Testing Center Design To address the second question, “How are the testing centers designed with regard to standard principles in interior design?” an observational interior design analysis of the physical environment of the testing centers was performed on February 18, 2013. The four main campus testing center sites and one departmental site were analyzed according to standard principles in interior design. Each site was visited outside of business hours for the purpose of the analysis. Approximately 30 minutes was spent at each center. For each site, the following items were assessed: 1. Adherence to core principles of design – Are the principles of design present and appropriately applied: emphasis, rhythm, scale, harmony, proportion, and balance? 2. Lighting – Is there daylight? 
Is the type and quantity of light appropriate?
3. Furnishings – Are the furnishings appropriate to the task? Do they allow adjustment for height, size, and for persons with disabilities?
4. Space-planning – Are the traffic patterns and flow smooth and easy to navigate? Is there good use of space?
5. Way-finding – Are the locations of entrances, exits, and the order of processing in and out clear?
6. Code compliance and other items particular to that site – Are there any code violations? Are there any unique design elements?

Photographs were taken of each testing center. The rooms were measured, the seats were counted, and the square footage per person was calculated. The university fire marshal accompanied the researcher to allow access and to report on code compliance. An interior design analysis was compiled for each site. Each item on the assessment list was given a numerical rating between one and five, with five indicating excellent and one indicating poor or absent, as a means of classifying how well each center's design conformed to standard principles and applications of design.

RQ3 – Test Scores

To address the third question, "Is there any statistically significant difference in test scores among the testing centers at Weber State?" a statistical analysis was performed on a random sample of 1,000 test scores pulled from the 26,465 exams delivered from January 14, 2013, to February 15, 2013. Student ID number, test ID, testing site ID, total points possible, total points received, time the test was started, and test duration were pulled from Chi Tester. SPSS was used to select a random sample from the tests delivered at the testing centers, and then student demographic information – sex, age, major, and college – was pulled from the student information system for each subject included in the random sample. Student identities were kept confidential and all student data remained on Weber State University systems. A one-way between-subjects ANOVA was conducted to compare the effect of testing center on test scores.
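The statistical work itself was done in SPSS. Purely as an illustration of the procedure just described, the following is a minimal Python sketch of the same steps; the file name and column names (site, points_possible, points_received) are hypothetical placeholders and are not part of the thesis.

```python
# Sketch only: draw a random sample of 1,000 exams and run a one-way
# between-subjects ANOVA of percent score by testing site.
# Assumes a hypothetical export "chitester_scores.csv" with columns
# "site", "points_possible", and "points_received"; the thesis used SPSS.
import pandas as pd
from scipy import stats

exams = pd.read_csv("chitester_scores.csv")        # 26,465 exams, Jan 14 - Feb 15, 2013
sample = exams.sample(n=1000, random_state=42)     # random sample of 1,000 tests
sample["pct"] = sample["points_received"] / sample["points_possible"] * 100

# One group of percent scores per site; with 7 sites and 1,000 scores the
# degrees of freedom are 6 and 993, matching the reported F(6, 993).
groups = [g["pct"].to_numpy() for _, g in sample.groupby("site")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")      # thesis reports F = 1.05, p = .395
```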
RESULTS

RQ1 – Student Perception Survey

Surveys were presented to 27,726 students from March 10, 2013, to March 23, 2013. Of these, 939 subjects responded to the survey. Seventeen responses were from Marriott Allied Health Sciences (MH 111), 137 were from Student Services (SC 262), 18 were from Science Lab (SL 228), 58 were from Social Sciences (SS 38), and 157 were from Student Union (SU 323). The remaining 552 responses were from other departmental or off-campus sites.

Perception of design. Of all participants, 85% rated their testing center a three or above with regard to their perception of the design, indicating that most students perceive the centers to be well designed. Over 70% of participants rated the MH111 center and the SU323 center a four or five, compared to 66% rating SC262 a four or five, 58% rating SS38 a four or five, and 44% rating SL228 a four or five. Results are summarized in Table 1 and Figure 1.

Table 1 – Percent responses for Question 2 (perception of design) by site
Rating | MH111 | SL228 | SC262 | SS38 | SU323
1 | 7.14% | 16.67% | 2.29% | 3.45% | 2.60%
2 | 14.29% | 11.11% | 4.58% | 6.90% | 3.25%
3 | 7.14% | 27.78% | 25.95% | 31.03% | 24.03%
4 | 14.29% | 22.22% | 36.64% | 25.86% | 28.57%
5 | 57.14% | 22.22% | 30.53% | 32.76% | 41.56%
Total | 100% | 100% | 100% | 100% | 100%

[Figure 1 – Responses to Question 2, Perception of Design, by site]

Design impact on testing. Of all participants, 70% rated the impact of design on their test a three or below, indicating that most students perceived the design to have medium to low impact. Participants rated the design as high impact most often at MH111, with 29% rating this a five, followed by SL228 with 22% rating a five and SS38 with 19% rating impact a five. Less than 10% rated the impact of design a five at SC262 and SU323. Results are summarized in Table 2 and Figure 2.

Table 2 – Percent responses for Question 3 (design impact on test) by site
Rating | MH111 | SL228 | SC262 | SS38 | SU323
1 | 7.14% | 22.22% | 34.35% | 24.14% | 25.97%
2 | 28.57% | 22.22% | 15.27% | 13.79% | 15.58%
3 | 28.57% | 33.33% | 25.95% | 25.86% | 31.82%
4 | 0.00% | 0.00% | 10.69% | 13.79% | 8.44%
5 | 28.57% | 22.22% | 9.92% | 18.97% | 9.74%
Total | 100% | 100% | 100% | 100% | 100%

[Figure 2 – Responses to Question 3, Impact of Design, by site]

Comments on impact of design. On question four, "Please describe the aspects of the design that impacted your test," the responses were coded as positive, negative, or neutral. Participants were divided in the percentage of comments that were positive or negative about the design elements impacting their exam. Notable differences appeared at SS38, where 50% reported negatively while 35% reported positively, and at MH111, where 22% reported negatively while 67% reported positively. The percentage of comments in each category is summarized in Table 3.

Table 3 – Comments on impact of design (Question 4) by site
Coding | MH111 | SL228 | SC262 | SS38 | SU323
Positive | 67% | 45% | 38% | 35% | 47%
Negative | 22% | 45% | 45% | 50% | 37%
Neutral | 10% | 10% | 17% | 15% | 16%

Participant comments included observations about the following items: lighting, seating, how closely their stations were situated, close contact with other participants, noise, temperature, computer keyboards, computer monitors, wall color, walls being bare or neutral, cleanliness, smell, and people coming and going in the testing room. An example of a positive comment from SU 323 was, "It's very quiet and cool. There isn't a lot to distract you, which is nice." An example of a negative comment from SC262 was, "I'm sitting literally inches from the people beside me. NOT ENOUGH SPACE!" An example of a neutral comment from SS 38 was, "The brick wall."

Proximity. Participants' comments most often involved factors related to space planning, in particular how closely they sat to other test takers. The most common complaint was that participants felt they were too close to someone else, as in the comments, "I feel really close to my neighbor; it is sometimes hard to concentrate," "The guy next to me kept elbowing me," and "I feel that the stations are very close together.
I feel a bit claustrophobic while taking my tests here.” Some expressed a strong emotional response by making their comments in all capital letters, such as “I'm sitting literally inches from the people beside me. NOT ENOUGH SPACE!” This closeness caused them to feel distracted by the noise, movement, and body odors of others. There were also positive comments about space planning that reinforce the indication that space planning and proximity to other test takers is important. One participant commented, “The room is open and uncluttered, and allows air to flow.” Other comments included, “I like the spacing of seats” “plenty of space” and “...placement that doesn't induce claustrophobia” which indicates that they perceive having adequate personal space as a desirable design element. Temperature. Participants’ comments indicate that the temperature is also an important factor affecting their test taking. Most complaints were about the temperature being too high. The room was described as “sweaty” and “stuffy.” One participant commented, “It was a bit too warm; it made it difficult to focus and think clearly.” Another one EFFECTS OF INTERIOR DESIGN ON COMPUTER-BASED TESTING AT WSU APRIL/2013 Carrie McCloud – Master of Professional Communication Thesis, Weber State University 22 commented, “The heat in this room is set to very high. I was sweaty and uncomfortable for most of the exam, that I was already anxious about taking.” Positive comments about temperature included, “Good temperature,” “For me, it felt like it was a warm, comfortable, well ventilated place,” and “It's very quiet and cool. There isn't a lot to distract you, which is nice.” Others expressed frustration that the temperature was unpredictable, “Some rooms are too hot or too cold. Sometimes I wish I brought a sweater, others times, I regret wearing one.” “The temperature in the room kept changing. From too warm to cool. I do better when it is cool.” Quiet. Several participants commented on the quiet, or the lack of quiet as being a factor that impacted their exams. Complaints included, “There were a lot of noises, especially a continual thud from outside the testing room, while I was taking this test,” “This testing center is much too frequently very noisy - whatever rooms are behind the doors often have people in them talking and laughing really loud,” and “Keyboards are so loud when typed on. I concentrate more on typing quietly than what I am typing.” The large number of positive comments about the room being quiet also suggests that this is an important factor. Comments included, “Quiet atmosphere,” “The noise level is good,” “It is quiet which helps,” and “Good quiet environment. Easy to take. Keep up the good work!” Lighting. Some participants commented about the lighting. Perceptions were mixed. Complaints indicated that they felt downward pointing fluorescent lighting gave “headaches,” caused “eye strain,” and is “stressful.” One participant noted “pointing them upwards and lighting the room by their reflection is a calming light.” Participants also indicated that lower lighting might be too relaxing, commenting that “...for some people a low stress environment creates a poorer performance” and that “On days when I’m tired ... (lower lighting) helps me be that much more sleepy ... 
a little bit brighter would help." Participants indicated that proximity to windows "helps a lot" and that they liked to see "nature outside the window," but thought the daylight coming through was a bit too bright.

Seating. While some participants commented that the seating was "well designed" and "comfortable," others indicated that poor seating had a high negative impact on their tests because the chairs were "not easy to sit on and be up to the computer" or because the "computer stations don't allow for enough leg room." One participant complained of pain due to inappropriate chair height: "The chair is too tall for the desk, making it uncomfortable to sit and type. My back began aching because of this after ten minutes of my hour-long test."

Walls. Participants seemed to prefer facing a wall while testing; they did not like facing another person. One participant commented, "I don't like to sit across from other people where I can see their faces when I take my test. It's just weird." While they did not want to have much on the walls to distract them – "There is nothing on the walls, which is a good thing because I can be easily distracted" – they also had strong negative perceptions that bare walls are "blank, boring and stressful." One wrote, "The walls were bare and foreboding." Having color on the walls was perceived as a positive design element, with comments such as "I liked the colors; it's a calming feeling," "I like the purple wall!!!" and approval of the "warm colors" and "two-tone paint" scheme.

Computers. There were only a few comments about the computers. A couple of participants commented that they did not like the keyboards, and a couple noted that they did like the large monitors. One participant was frustrated that the monitor was too bright but "didn't want to waste my testing time trying to figure out how to reduce it."

Just Fine. Several participants commented that they felt the design had no impact on their test or that the design was fine. Comments included such statements as "No impact, simple easy to use," "Not distracting," "nothin' to say yo," and "None, my blatant lack of study impacted my test the most." This indicates that the current design of the testing centers is adequate for these students and is meeting their needs. One participant commented, "I feel the environment had little to no impact on my test. The environment is very neutral and conducive to testing, in my humble opinion." Another student offered this comment: "Thanks for asking this, I feel like my testing center is comfortable and that is important to me."

Following are comments by participants, broken down by testing center:

MH111. Participants did not like:
- facing other students
- fluorescent lights, which gave them headaches and eye strain
Participants liked:
- it was nice, clean, and quiet
- computers were large
- it was well lit

SL228. Participants did not like:
- it was cluttered
- it was inconsistent
- the walls were bland
- noises coming from outside the room
Participants liked:
- the way the computers faced
- the quiet
- the peaceful scenery photos

SS38.
Participants did not like:
- seeing other people's faces sitting across from them
- computer monitors that were too bright
- sitting too close to their neighbor
- chairs that were too tall for the desk or that did not allow them to sit up to the computer
- lights that were not bright enough
- people next to them eating candy
- people sitting next to them taking the same test
- that the room was cold
Participants liked:
- the availability of headphones
- the quiet
- the spacing of the seats
- calming colors
- ease of getting in and out
- the room was clean
- it was neutral so as not to distract

SC262. Participants did not like:
- the room felt cramped and stuffy
- lack of leg room
- limited desk space
- hard lighting
- computers too close together
- the room was too hot
- they weren't facing a window
- it smelled bad
- it was crowded and sweaty
- the noise machine
Participants liked:
- it was warm, comfortable, and well-ventilated
- good lighting
- the dark purple wall
- they faced a wall
- it was quiet
- no distractions
- warm colors were relaxing
- there were two wall colors
- the walls were bare
- there was good seating

SU323. Participants did not like:
- the room was blank
- the room was boring
- keyboards were terrible
- could hear noise outside
- being next to doors
- seeing people coming and going
- seeing test proctors moving around
- the room was institutional
- keyboards were loud
- the temperature was warm
- the person in the adjacent seat was elbowing
- it was sweaty
- a noisy computer
- fluctuating temperatures
- bare walls
- too much light from the windows
Participants liked:
- they could see the clock
- the temperature was comfortable
- colors were calming
- chairs were comfortable
- it was quiet
- the room was neutral
- it was cool
- calm setting
- lighting was relaxing
- good air flow
- sound dampening
- open space planning
- bare walls
- soft colors
- chairs that were adjustable
- the color of the room
- it was open and uncluttered
- the walls were textured
- you could see nature out the window

Additional Comments. On question five, "If you have any additional comments about your testing experience, please provide them below," the responses were coded as positive, negative, or neutral with regard to the design of the center. There were a total of 36 responses to this question. Four comments were positive toward the design of the centers; 17 comments were negative; 15 comments were not about the design but were about other items of concern. The number of comments is summarized in Table 4. Comments made by participants for both question four and question five are included in Appendix A.

Table 4 – Number of Additional Comments (Question 5) by site
Coding | MH111 | SL228 | SC262 | SS38 | SU323
Positive | 0 | 0 | 1 | 0 | 3
Negative | 1 | 0 | 7 | 4 | 5
Neutral | 0 | 4 | 2 | 2 | 7

RQ2 – Testing Center Design Analysis

The testing centers were visited on February 18, 2013, while the campus was closed for Presidents' Day. The fire marshal noted no code violations in any of the centers. Overall, the centers were rated a three or above on all design criteria analyzed. The SU323 center received the highest ratings, with all criteria rated a five except furnishings, which received a four, for a combined score of 29. This center had the second-largest amount of space allowed per student, at 23 ft². The other centers ranged from a combined score of 21, in the east room of SC262, to 26 for MH111. Most centers allowed between 15 and 19 ft² of space per student.
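For clarity about the figures in Table 5 below: each center's combined score is simply the sum of its six criterion ratings, and density is measured floor area divided by the number of seats. A minimal sketch of that arithmetic, using the SU323 ratings from Table 5 and hypothetical area and seat-count figures (the thesis does not report the raw room measurements):

```python
# Sketch of the Table 5 arithmetic: combined design score and square feet per person.
# The six criterion ratings below are SU323's values from Table 5; the floor area
# and seat count are hypothetical placeholders chosen to give ~23 sq ft per person.
su323_ratings = {
    "core principles of design": 5,
    "lighting": 5,
    "furnishings": 4,
    "space-planning": 5,
    "way-finding": 5,
    "code compliance & other factors": 5,
}
combined_score = sum(su323_ratings.values())   # 5 + 5 + 4 + 5 + 5 + 5 = 29, as reported

room_area_sqft = 1150                          # hypothetical measured floor area
seat_count = 50                                # hypothetical number of computer stations
density = room_area_sqft / seat_count          # 23.0 sq ft per person
print(combined_score, round(density, 1))       # -> 29 23.0
```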
Table 5 below summarizes the ratings given to each center. Supporting comments and photographic images are included in Appendix B.

Table 5 – Results of Design Analysis by Testing Center
Criterion | MH111 | SL228 | SS38* | SC262* | SU323
Adherence to core principles of design | 5 | 3 | 3 / 3 | 4 / 3 | 5
Lighting | 4 | 3 | 3 / 3 | 5 / 4 | 5
Furnishings | 4 | 4 | 3 / 3 | 4 / 3 | 4
Space-planning | 4 | 4 | 4 / 4 | 3 / 3 | 5
Way-finding | 4 | 4 | 4 / 4 | 5 / 4 | 5
Code compliance & other factors | 5 | 4 | 5 / 5 | 4 / 4 | 5
Density (square feet per person) | 17.0 | 14.9 | 18.2 / 28.8 | 18.9 / 17.7 | 23.0
*This testing center has two testing rooms; each room is rated separately, with the west room first and the east room second.

RQ3 – Test Scores Statistical Analysis

An ANOVA by testing site was performed on a random sample of 1,000 test scores to determine whether there was a relationship between mean test scores and the testing center used. Seven centers were tested: Marriott Health (MH111), Science Lab (SL228), Student Services (SC262), Social Science (SS38), Union (SU323), West Center (W114), and Davis Campus (D2215). There was no significant effect of testing center on test scores, F(6, 993) = 1.05, p = .395. This indicates that the testing center used does not significantly impact student test scores.

DISCUSSION

RQ1. What are student perceptions of the interior design of the testing centers?

The results of the survey indicate that students overall have a positive perception of the design of the testing centers. The majority of participants in all centers, more than 70%, rated the testing center design at a three or above, as indicated in Figure 1. The MH111 testing center received the highest percentage of participants, 57%, selecting a rating of five, and the highest ratio of positive to negative comments, 6/2, indicating that participants perceive this center to be well designed. Students were widely divided with regard to their perception of the design's impact on their test taking, as indicated in Figure 2, with the majority rating the impact at a three or below. The overall ratio of positive to negative comments was 72/75. The participants' comments are enlightening as to the interior design factors that they do perceive as affecting their performance. One participant commented, "When I walked into the test, it affected my testing attitude." The comments participants made indicate that some feel strongly about interior design elements and the impact these elements have on their testing experience. The comments typed out in all capitals complaining that there was not enough space, the repeated exclamation points, and terms such as "claustrophobic" indicate an emotional response to the elements that distract. The positive comments, too, indicate an emotional response, such as the "calming" colors and the "peaceful" scenery that participants noted. It is also of interest how consistent the responses were.
Comment after comment, the same issues came up, as both positive and negative, in one form or another. Proximity to other test takers, quiet, walls, chairs, temperature, and visual privacy were noted most often.

RQ2: How are the testing centers designed with regard to standard principles of design?

The design analysis concurred with the perception of participants, with all the centers receiving a three or above for all the rating criteria. The design of the testing centers did not differ significantly, and each center has areas that could be improved. The design analysis rated SL228 slightly lower than the other centers, and it also received the largest percentage of participants, 17%, selecting a rating of one, or poor design. It is only fair to point out that the building which houses SL228 has been officially slated for demolition for some time and efforts to construct a new building are in progress. SC262 received the highest ratio of negative to positive comments, 32/27, with the majority of the negative comments concerning temperature and overcrowding.

One notable area needing improvement was space planning. As noted by the students, some areas have computers placed more closely together than is desirable. Adequate space should be allowed to prevent "elbowing," as one participant called it, and to allow test takers to pull out seats, sit, stand, and enter and exit the areas without contact with other test takers. Some participants noted facing other participants as a distraction. Although facing a wall eliminated this distraction, participants were mixed in their reactions to facing a wall. One possible solution would be partitions that block the view of other test takers and lend a sense of visual privacy as well as personal space.

The walls of most centers could also use attention. Although the participants noted a desire for clean walls without distraction, they also commented on bare walls being a distraction or too "institutional" in nature. The design analysis noted this as well. One possible fix would be to create visual relief by way of sound-absorbing panels. Wall panels would not only assist in noise reduction, but could be placed to create rhythm and emphasis on the walls and give visual relief without distraction.

Several participants noted the lighting. Although the accent lighting in SC262 was perceived positively in the design analysis for the rhythm it created on the walls, one student commented that this light, shining directly down on the computer, created glare. Accent lighting aimed toward the wall, so that it bounces back into the room, might be one way to alleviate this particular issue. One participant noted the view of nature out the window. A few others noted the windows without commenting on a view, but made note of the quantity of light. This would indicate that natural shading from trees or plants might be the ideal way to incorporate windows and daylight in a testing center. If trees were planted outside the windows, the light would be muted and there would be a view, but not a distracting one.

RQ3: Is there a difference in test scores among the testing centers?
Test scores did not vary significantly by testing center. This would indicate that while there are differences in the design of the testing centers, the differences are not significant enough to impact test scores. This does not tell us whether space planning or daylight affects test scores. It only tells us that, at the end of the day, test scores overall at Weber State University are the same at all the testing centers.

Future Study

The next step would be finding out what specific factors affect test scores. Just how crowded do students need to be before their test scores suffer? Does allowing generous space between test takers improve test scores? Future studies in controlled environments would inform these questions. Ideally, a research testing facility could control for the elements noted in this study as affecting computer-based testing, such as lighting, proximity, and temperature, and assist in creating criteria that higher education institutions could use when building new centers.

Limitations

This study was approached with the assumption that helping students feel comfortable and helping them to score higher is the end goal. Perhaps that is not the only consideration; testing chairs may face a particular direction to make proctoring easier or to fit as many people into a space as possible. A future study that includes a survey of testing center personnel as well as faculty would better answer these questions.

The statistical analysis of the test scores did not do much to inform the study. Students self-selected which testing center they wished to use, and so there are too many variables to get any clear understanding of the effect of the design on test scores. A controlled study of students taking the same course but randomly assigned to different centers would better answer this question.

Another limitation of the study was the design analysis. The instrument was somewhat subjective. Also, although the researcher has a background in interior design, it would be more appropriate to have a team of designers evaluate the design of the testing centers. It would be interesting to see a future study in which a team composed of interior design educators and interior design practitioners, who have expertise on the subject of educational spaces, evaluates the design of the testing centers.

The most critical limitation of this study was the bias of the researcher, who is biased for several reasons. First, I was the principal designer of the Marriott Health testing center. After working there for five years, I proposed and designed the project that knocked down walls and re-created the area. It took almost a year to complete. I am very proud of the result. Second, as an undergraduate, I took tests in three of the five testing centers. I have a negative bias toward the spaces that felt especially stressful and a positive bias toward the one I preferred while I was testing for my classes. It is interesting to me how vividly I can remember how the rooms felt while taking critical exams. Third, I like and admire the current administrators of the testing centers.
I know and understand the space and funding limitations that restrict their design choices, and I admire their ability to make the best possible testing spaces available to students despite limited funding, high demand for testing services, and limited physical space. They have my profound respect, and I consider them friends.

CONCLUSION

The study was able to answer, at least in part, two of the three research questions. Student comments gave valuable insight into RQ1, what student perceptions of the testing centers are. The design analysis informed an answer to RQ2 and documented how the current centers are designed. Having photographs of these spaces also allows the conversation to continue, and further analysis based on the images can take place. The study did not adequately answer RQ3, and it is still not clear whether test scores vary by testing center; although the ANOVA found no significant difference, students self-selected where they tested, so further study is necessary to answer this question.

After examining the current testing center spaces, looking at test scores, and reading the student responses, an image of the perfect testing center begins to form. The center would be quiet. The lighting would be indirect. Daylight would trickle in but not shine directly on anyone; preferably, nature would shade the windows from the outside. The walls would have variety created by architectural elements, by subtle hints of vibrant color, and by textures, but no pictures would hang on the walls. This center would have enough personal space for stretching out legs, for pushing back a chair, and for laying out papers. Personal space would be more than an elbow's reach apart. The chairs would be adjustable and would face a wall or partition. The computer screens would be large and the keyboards would be quiet. The temperature would be consistently cool. An air filtration system would keep the room smelling fresh. Such a perfect space would help clear distraction, perhaps removing just a little bit of the noise from the computer-mediated communication channel and enhancing the communication between instructor and student in the testing environment.

REFERENCES

Anakwe, B. (2008). Comparison of Student Performance in Paper-Based Versus Computer-Based Testing. Journal of Education for Business, (September/October), 13-17.
Bodker, S., & Sundblad, Y. (2008). Usability and interaction design - new challenges for the Scandinavian tradition. Behaviour & Information Technology, 27(4), 293-300.
Bridgeman, B., Lennon, M. L., & Jackenthal, A. (2003). Effects of Screen Size, Screen Resolution, and Display Rate on Computer-Based Test Performance. Applied Measurement in Education, 16(3), 191-205.
Brooks, D. C. (2011). Space matters: The impact of formal learning environments on student learning. British Journal of Educational Technology, 42(5), 719-726.
Carroll, J. M. (2013). Human computer interaction - brief intro. In M. Soegaard & R. F. Dam (Eds.), The Encyclopedia of Human-Computer Interaction (2nd ed.). Aarhus, Denmark: The Interaction Design Foundation. Retrieved from http://www.interaction-design.org/encyclopedia/human_computer_interaction_hci.html
Caudle, P., Bigness, J., Daniels, J., Gillmor-Kahn, M., & Knestrick, J. (2011). Implementing Computer-Based Testing in Distance Education for Advanced Practice Nurses. Nursing Education Perspectives, 32(5), 328-333.
CIDA. (2013, March). Council for Interior Design Accreditation. Retrieved from http://accredit-id.org/
Clariana, R., & Wallace, P. (2002). Paper-based versus computer-based assessment: key factors associated with the test mode effect. British Journal of Educational Technology, 33(5), 593-602.
Coyne, I., & Bartram, D. (2006). Design and Development of the ITC Guidelines on Computer-Based and Internet-Delivered Testing. International Journal of Testing, 6(2), 133-142.
Crilly, N. (2011). The Design Stance in User-System Interaction. Design Issues, 27(4), 16-29.
Dosch, M. P. (2012). Practice in Computer-Based Testing Improves Scores on the National Certification Examination for Nurse Anesthetists. AANA Journal, 80(4), S60-S66.
Dunn, R. (1991). Footloose and fancy-free: Kicking the habit of conventional classroom furniture. Clearing House, 64(6), 369.
Eley, C. (2006). High Performance School Characteristics. ASHRAE Journal, May, 60-66.
Faiola, A., & Matei, S. A. (2010). Enhancing human-computer interaction design education: teaching affordance design for emerging mobile devices. International Journal of Technology & Design Education, 20, 239-254.
Fernandez, L. (2012, November 23). Email interview.
Figueiro, M., Brons, J., Plitnick, B., Donlan, B., Leslie, R., & Rea, M. (2011). Measuring circadian light and its impact on adolescents. Lighting Research & Technology, 43, 201-215.
Grignon, S., Gregoire, C.-A., Durand, M., Mury, M., Elie, D., & Chianetta, J. M. (2009). Age-dependent discrepancies between computerized and paper cognitive testing in patients with schizophrenia. Social Psychiatry & Psychiatric Epidemiology, 44, 73-77.
Hwang, G.-J., Tseng, J. C., & Hwang, G.-H. (2008). Diagnosing student learning problems based on historical assessment records. Innovations in Education and Teaching International, 45(1), 77-89.
Katz, A. (2010). Aesthetics, usefulness and performance in user-search-engine interaction. Journal of Applied Quantitative Methods, 5(3), 424-445.
Lippman, P. C. (2010). Can the physical environment have an impact on the learning environment? CELE Exchange, 13, 1-5.
NCIDQ. (2013, March). National Council for Interior Design Qualification. Retrieved from http://www.ncidq.org/
Nielson, K. J., & Taylor, D. A. (2002). Interiors: An Introduction. New York: McGraw Hill.
Norman, D. A. (2004). Emotional design: Why we love (or hate) everyday things. New York: Basic Books.
Sellen, A., Rogers, Y., Harper, R., & Rodden, T. (2009). Reflecting Human Values in the Digital Age. Communications of the ACM, 52(3), 58-66.
Shannon, C. E. (1948). A Mathematical Theory of Communication. The Bell System Technical Journal, 27(July, October), 379-423, 623-656.
Sottilare, R. A., & Proctor, M. (2012). Passively classifying Student Mood and Performance within Intelligent Tutors. Educational Technology & Society, 15(2), 101-114.
Stone, N. J. (2008). Human Factors and Education: Evolution and Contributions. Human Factors and Education, 50(3), 534-539.
Tan, O.-S. (2011). Problem-based Learning Approach to Human Computer Interaction. World Academy of Science, Engineering and Technology, 76, 462-465.
Thurlow, M., Lazarus, S. S., Albus, D., & Hodgson, J. (2010). Computer-based testing: Practices and considerations (Synthesis Report 78). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.
Wang, N., & Boubekri, M. (2011). Design recommendations based on cognitive, mood and preference assessments in a sunlit workspace. Lighting Research & Technology, 43, 55-72.
Wu, W., & Ng, E. (2003). A review of the development of daylighting in schools. Lighting Research & Technology, 35(2), 111-125.

APPENDIX A

Participant comments to Questions 4 and 5 on the survey are listed below by testing center.

Question 4: Please describe the aspects of the interior design that impacted your test.

MH111
Facing another person while trying to take a test - especially a person in my class - is very distracting nice clean quiet large computers; too much glare in lightin;. fluorescent lights give headaches, eye strain. Quiet well lit scratch paper helped

SL 228
Cluttered and inconsistant. I had a lot of problems with this test and I kept interrupting the test process for the other students I like the way the computers are facing. I like the way the room is set up. I feel it is better, it doesn't seem as tight. It is quiet which helps. And there are a couple peaceful scenery shots on the walls. I feel like it's really bland though...maybe put some encouraging quotes on the walls? More encouraging posters :) There were a lot of noises, especially a continual thud from outside the testing room, while I was taking this test.

SS38
Availability of headphones Good quiet enviornment. Easy to take. Keep up the good work! I don't like to sit across from other people where I can see their faces when I take my test. It's just weird. I don't think it was the design so much as the brightness of the computer screen itself...it was WAY too bright for me, and I didn't want to waste my testing time trying to figure out how to reduce it. I don't think the layout of a room impacts my test taking. I feel really close to my neighbor, it is sometimes hard to concentrate. I like the spacing of seats. I liked that the essay questions were separated. I liked the colors its a calming feeling It is fine how it is. It was very easy to get in and out without feeling like I was impacting anyone's testing progress. The brick wall. The chair is too tall for the desk, making it uncomfortable to sit and type. My back began aching because of this after ten minutes of my hour long test. The chairs on not easy to sit on and be up to the computor, bad ergonomics. The lights could be a bit brighter (not blaring though). On days when I'm tired (which is usually when I take a test because I was up late studying), it helps me be that much more sleepy. It's not too bad, but a little bit brighter would help.
The person next to me was eating candy and taking the same test as me. The room is cold. The room/environment was clean, ambient, neutral so as to not distract.

SC262
A little stuffy and cramped. Computer stations dont allow for enough leg room. During long tests, legs begin to cramp when i cant move my chair back, which I could not due to other students testing at a desk behind me Controlled volume and temperature cramped room and table. Felt sort of boxed into the computer. Desk space. Need more room to spread out my testing materials downward florescent lights are stressful. Pointing them upwards and lighting the room by their reflection is a calming light. For me, it felt like it was a warm, comfortable, well ventelated place. Very good feel for testing as it lowered outside distraction. good light Hard lighting Having the computers so close to eachother can be a little distracting. I am always very tired when I test in this room. I don't know if it's the design, the air or the PC/testing colors, but I have a hard time focusing on the exams. I am not sitting uncomfortably close to those next to me and I can't get distracted because I am staring at a blank wall. :) i don't think any of the design impacted my test. I feel cramped during the test and somewhat distracted by the activities of others on either side of me. I feel that the stations are very close together. I feel a bit claustophobic while taking my tests here. I felt the computers were a bit close. Would have liked to have more personal space between me and the other test takers. I think the temperature in here bets a bit high. I wasn't facing a window. I'm sitting literally inches from the people beside me. NOT ENOUGH SPACE! It is EXTREMELY hot in here!! I'm dying of heat!! We need air conditioning. It 's fine. It was a bit too warm; it made it difficult to focus and think clearly. its a little close together but the noise level is good. and i like the purple wall!!! kind of crowded made it hot and kind of uncomfortable. Lighting and temperature. They were fine and did not distract from test taking. No impact, simple easy to use None None, my blatant lack of study impacted my test the most. Not distracting. nothin' to say yo. Overall ambient, wall paint color, computer screen and keyboard privacy, sound proof Smelled bad. Not the rooms fault, but the person I sat next to. Somewhat crowded. More distractions that way. Sweaty in room! The back of my computer is to the wall, so im facing the wall the whole time. No distractions or temptations to focus on other things the close proximity to the noise machine the computer layouts. The computers are too close together. Sometimes is hard to fit all test materials in such small space. The fact that I am facing the wall helps me a ton in focusing. If I see other people moving I get distracted. I would also be distracted if the walls had several different colors.I like the fact that it is a solid, dark color (purple). The Quietness and lack of distractions during my test. the temperature too hot.. negative effect and lighting directly overhead shining on eyes negative effect The warm colors were relaxing There should be more room at the computer desk. there was a light above my computer that was too bright There was one problem (my Final problem) that contained three brackets [A*(BvC)]=(AvB)].
I was unclear as to where to place order due to missing or extra bracket. there's nothing on the walls which is a good thing because i can be easily distracted Two tone paint, 90% o the walls are light colors Well lit and well designed seating

SU323
Being able to see the clock was nice. It was a good temperature to keep me focused and alert. It was a comfortable room with calming colors. That made it helpful. Blank, boring and almost stressful. color comfortable chairs. TERRIBLE keyboards! Easy to hear outside noise Good temperature, quiet, comfortable chair i didnt notice it. I dont like being next to the doors and seeing in the corner of my eyes people coming and going, also see the test proctors moving around while seated is distracting from this computer...its distracting...Computer 1 was annoying. I don't like the windows to the "lobby", it distracts me to see people out there. I don't think the room design affected my performance. I feel the environment had little to no impact on my test. The environment is very neutral and conducive to testing, in my humble opinion. I like how the colors aren't too dark or too light- it's more in the middle, and it helped me to concentrate. It feels like it is too institutional. It wasn't distracting at all. It's a very dull room. I understand it is meant to not distract the students but I just get instantly sleepy when I test in here. For the computers, I like the screens but the keyboards are so loud when typed on. I concentrate more on typing quietly than what I am typing. It's very quiet and cool. There isn't a lot to distract you, which is nice. Just having a calm setting with calm neutral colors that weren't distracting loud keyboards Lower lighting creates more relaxed environment, for some people a low stress environment creates a poorer performance. n/a none None None. Once again, neutral color scheme, good air flow, sound dampening, and placement that doesn't induce claustraphobia. plenty of space. Quiet atmosphere, neutral colors rather closed off, no distractions on the walls besides the clock which is good. Soft colors made me calm while I was testing. Comfortable chairs helped feel at ease. sound dampening, climate control, are very important because it makes for fewer distractions. These are adequate at this testing center. The chairs used are probably the most important, my comfort and ability to ajust height, distance from computer, sitting back/reclining are nice during long test. Thanks for asking this, I feel like my testing center is comfortable and that is important to me. temperature a little warmer than i would prefer for a test. the color of the room and lighting The guy next to me kept elbowing me. The heat in this room is set to very high. I was sweaty and uncomfortable for most of the exam, that I was already anxious about taking. I prefer a cooler environment as I am more comfortable being slightly cold than over heated. the noisey vibrating computer next to mine #11 The room almost feels too warm. The room is open an uncluttered, and allows air to flow. Neutral color scheme helps as well. The tempeture in the room kept changing. From too warm to cool. I do better when it is cool.
The walls were bare and foreboding. They actually helped! Looking at the texture on the wall vs. the smooth wall, the focal point of nature outside the window, and effects of color and line on your mood! I always use this testing center because I feel it is the best designed of them all! They keyboard I used had a smaller backspace key than I was used to, and it gave me some difficulty since I just took a short-answer test. When I wallked into the test it affected my testing attitude Windows helps a lot, but havin them covered so it isnt so bright.

Question 5: If you have any additional comments about your testing experience, please provide them below.

MH111
This testing center is much too frequently very nosiy - whatever rooms are behind the doors often have poeple in them talking and laughing really loud.

SL228
It would be really helpful if ChiTester would design Logic tests with graph paper-like answering spaces. It would be great if you could add the Logic symbols to your programming, too. Shouldn't be that difficult. It would be extremely useful if questions could be 'marked' so that you can come back and review them later. It would be nice if the peope working in the testing center know how to help you with a problem that arrises.

SS38
The question about Shaquille & Walter did not give the best answer even though I chose the one we discussed "Walter is a late maturer." The following answer is the better answer "Walter, a virgin, does not have to worry about contracting a STD." The person sitting next to me smelled like smoke so bad that I couldn't concentrate. the room smells bad, lots of BO Some rooms are too hot or too cold. Sometimes I wish I brought a sweater, others times, I regret wearing one. no. I wish I could've pulled out my phone and written down one of the question's options, it was hilarious! The shelf under my table was very annoying. I hit my legs on it many times during my test.

SC262
The design is fine just need to make sure that the temp stays good especially on finals week when it is beyond full because it getting too hot makes it really hard to concentrate Several computers in the testing room did not work. Spacing would be nice... It is hard to consintrate with what feels like other people looking at your computer screen.....Kinda feel cramped. The room is too hot. I find marketing and economic issues to be rather boring. More snacks. The temperature was much too high in the testing room. This made it difficult for me to concentrate on my test. Overall good. It is a little darker in here than I would normally like. This is probably due to the type of lighting you have. The air machine by the door was loud and a little distracting.

SU323
the symbol for medicine is actually the staff of the greek messenger god and has two talking snakes coiled around it If the testing center was a poor facility, then I believe it would have a significant impact on my testing. As it is a nicer facility I don;t feel it had much impact at all; it was not a distraction. Cool the room down. Warmer temperatures allow for more nap times. Cool down the room. It is too warm and a good place to take a nap.
I would like the blinds to be open -- I would be more at ease and would take more time on my test. Don't like closed-in feeling. The essay question boxes are larger than the width of the screen, so you have to scroll back and forth to see what you typed. Very annoying. Proctors were nice and helpful Everything is fine except this damn keyboard. Not that this was the testing center's fault, but I think there was a drum line rehearsal directly below us, because that's all anybody could hear for a good half hour. n/a none None. As ever, a thoroughly enjoyable experience! The Chi tester site NEEDS a help page. I take distance courses and had to fly from Seattle to SLC to take this exam due to repeated technical difficulties which could have been remedied if there was a simple FAQ section. Clean, orderly, and fast! Always a pleasure. I wasn't able to reserve a time when I went into the chi tester.

APPENDIX B

Marriott Allied Health Sciences Building (MH 111)

This testing center is designated MH 111 and is located on the first floor of the J. Willard Marriott Allied Health Sciences Building.
Image 1 Front of Marriott Allied Health Sciences Building

Small sign on the right of double doors indicates that the testing center is inside and the hours of operation. Check-in counter in the back corner. Students walk through an open student computer lab in front to get to the testing area.
Image 2 Entrance to MH 111
Image 3 First view from inside the door of MH 111

View of the testing room from the northwest corner of the room. View from the northeast corner of the room.
Image 4 View of MH 111 testing room from NW
Image 5 View of MH 111 testing room from NE

View from the southwest corner of the testing room. View from the southeast corner of the testing room.
Image 6 View of MH 111 testing room from SW
Image 7 View of MH 111 testing room from SE

Seating in the testing room. Computers, keyboards and headphones used in the testing room.
Image 8 Seating used in MH 111
Image 9 Computers used in MH 111

The check-in counter with instructions on the wall and hanging overhead. In the top right is the only daylight that enters the area from clerestory windows.
Image 10 Check-in counter for MH 111
Image 11 Daylight for MH 111

Light fixtures in the testing room. Storage for student materials inside the testing room. A second similar storage unit is located by the check-in counter.
Image 12 Light fixtures for MH 111
Image 13 Student storage for MH 111

Marriott Allied Health Sciences Building - MH 111
Table 6 – Design Analysis for MH 111
Item | Rating | Comments
Adherence to core principles of design – Are the principles of design present and appropriately applied: emphasis, rhythm, proportion, harmony, scale? | 5 | Computers in testing area are focal point, could be reinforced. Computer/seating unity creates harmony and rhythm. Scale and proportion are good. Colors relaxing.
Lighting – Are there different types of lighting: ambient, task, and accent? Is there daylight? Is the quantity of light appropriate? | 4 | Parabolic fixtures are good for computer work, quantity is appropriate. Wall color helps offset cold fluorescent lighting. No direct daylight, but light near check-in counter; only ambient light, no task or accent lighting.
Furnishings – Are the furnishings appropriate to the task? Do they allow adjustment for height, size? | 4 | Seating is appropriate and adjustable. Only one table height provided; does not accommodate tall or large individuals.
Space-planning – Are the traffic patterns and flow smooth and easy to navigate? Is there good use of space? | 4 | Easy to navigate, good use of space, possible bottleneck at entrance with narrow area for both entrance and exit.
Way-finding – Are the locations of entrances, exits, and the order of processing in and out clear? | 4 | Good signage at counter and on doors within testing area. Only small sign on outside door.
Code compliance and other items particular to that site. | 5 | Complies with code, good use of color to create quiet mood.
Density – square feet/person | 17.0 ft2 | 560.0 ft2 / 33 seats
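The density figure in each design-analysis table is simply the room area divided by the number of seats. The short sketch below, with areas and seat counts copied from Tables 6 through 10, reproduces that arithmetic; the dictionary layout and variable names are only illustrative.

```python
# Density (square feet per person) = room area / number of seats.
# Areas and seat counts are copied from the design-analysis tables in this appendix;
# e.g., MH 111: 560.0 ft2 / 33 seats is approximately 17.0 ft2 per person.
rooms = {
    "MH 111":        (560.0, 33),
    "SL 228":        (776.3, 52),
    "SS 38 (west)":  (729.0, 40),
    "SS 38 (east)":  (432.0, 15),
    "SC 262 (west)": (907.7, 48),
    "SC 262 (east)": (612.2, 35),
    "SU 323":        (968.0, 42),
}

for room, (area_sqft, seats) in rooms.items():
    print(f"{room}: {area_sqft / seats:.1f} ft2 per person")
```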
Science Lab (SL 228)

This testing center is designated SL 228 and is located on the second floor of the Science Lab.
Image 14 Side entrance to Science Lab

Signs on the wall and on the door indicate that the testing center is inside and the hours of operation. The check-in counter is behind an open student study area.
Image 15 Entrance to SL 228 testing center
Image 16 Check-in counter for SL 228

View of testing room from the door, northwest corner of the room. View of testing room from southwest corner.
Image 17 View of SL 228 testing room from NW
Image 18 View of SL 228 testing room from SW

View of testing room from southeast corner. View of testing room from northeast corner.
Image 19 View of SL 228 testing room from SE
Image 20 View of SL 228 testing room from NE

The computers, keyboards, mice, and headphones in the testing room. One of two kinds of chairs for computer tests in the testing room.
Image 21 Computers used in SL 228 testing room
Image 22 Chair used in SL 228 testing room

Storage for student materials inside the testing room. Light fixtures in the testing room.
Image 23 Student storage for SL 228
Image 24 Light fixtures for SL 228

Science Lab – SL 228
Table 7 – Design Analysis for SL 228
Item | Rating | Comments
Adherence to core principles of design – Are the principles of design present and appropriately applied: emphasis, rhythm, proportion, harmony, scale? | 3 | Questionable emphasis – is it the seating or the window for proctor viewing? Rhythm is created by wall coverings over chalkboards, but seems inappropriate. Good harmony with color and scale of furnishings.
Lighting – Are there three types of lighting: ambient, task, and accent? Is there daylight? Is the quantity of light appropriate? | 3 | Parabolic fixtures good for computer work; no daylight; no task light; no accent light; quantity of light is appropriate. Dark wall coverings lend to darkness.
Furnishings – Are the furnishings appropriate to the task? Do they allow adjustment for height, size? | 4 | Seating is appropriate and adjustable.
Space-planning – Are the traffic patterns and flow smooth and easy to navigate? Is there good use of space? | 4 | Easy to navigate; good use of space. Computers seem a little crowded and seating is close to paper test seating.
Way-finding – Are the locations of entrances, exits, and the order of processing in and out clear? | 4 | Entrance and exits in testing room clearly marked, including doors that are not to be used for exits.
Code compliance and other items particular to that site. | 4 | Complies with code, at maximum occupancy with alternate exits blocked.
Density – square feet/person | 14.9 ft2 | 776.3 ft2 / 52 seats

Social Sciences Building (SS 38)

This testing center is designated SS 38 and is located in the basement of the Social Sciences Building.
Image 25 Front of the Social Sciences Building

Sign to the right of the door indicates the testing center is inside and the hours of operation. Check-in counter.
Image 26 Entrance to SS 38 testing center
Image 27 Check-in counter for SS 38

View from the door of first testing room, southeast corner. View from northeast corner of first testing room.
Image 28 View of SS 38 testing room from SE
Image 29 View of SS 38 testing room from NE

View from northwest corner of first testing room. View from southwest corner of first testing room.
Image 30 View of SS 38 testing room from NW
Image 31 View of SS 38 testing room from SW

Chair from first testing room. Chairs from second room were also stationary style, but with more flexibility in frame. Computers, monitor, keyboard, mouse, and headphones for testing.
Image 32 Seating used in SS 38 testing room
Image 33 Computer used in SS 38 testing room

View of second testing room from entrance, northwest corner. View of second testing room from southwest corner.
Image 34 View of SS 38 second testing room from NW
Image 35 View of SS 38 second testing room from SW

View of second testing room from southeast corner. View of second testing room from northeast corner.
Image 36 View of SS 38 second testing room from SE
Image 37 View of SS 38 second testing room from NE

Light fixtures in testing rooms. Storage for student materials next to check-in counter.
Image 38 Lighting used in SS 38 testing room
Image 39 Student storage for SS 38

Social Sciences Building - SS 38
Note: This testing center has two testing rooms; each room is rated separately. (West room is rated first / east room is rated second.)
Table 8 – Design Analysis for SS 38
Item | Rating | Comments
Adherence to core principles of design – Are the principles of design present and appropriately applied: emphasis, rhythm, proportion, harmony, scale? | 3/3 | No emphasis – seats face a brick wall which has equal emphasis with viewing window; rhythm established by seating; neutral color scheme is harmonious; scale is appropriate.
Lighting – Are there three types of lighting: ambient, task, and accent? Is there daylight? Is the quantity of light appropriate? | 3/3 | Fluorescent lighting is harsh, but adequate for ambient light; no daylight, basement location; accent light would help alleviate the sense of being in the basement.
Furnishings – Are the furnishings appropriate to the task? Do they allow adjustment for height, size? | 3/3 | Furnishings are adequate; some chairs are not adjustable, several different types of seating.
Space-planning – Are the traffic patterns and flow smooth and easy to navigate? Is there good use of space? | 4/4 | Paper seats have been arranged to allow for ample room to maneuver; seats facing the wall with back to doors is undesirable.
Way-finding – Are the locations of entrances, exits, and the order of processing in and out clear? | 4/4 | Signs on doorways indicate entrance and exit; several doors to choose from which could cause confusion.
Code compliance and other items particular to that site. | 5/5 | No code violations; egress is clear and accessible.
Density – square feet/person | 18.2 / 28.8 | 729 ft2 / 40 seats and 432 ft2 / 15 seats

Student Services Building (SC 262)

This testing center is designated SC 262 and is located on the second floor of the Student Services Building.
Image 40 Exterior View of Student Services Building

Entrance to testing center is through an open student study area. Check-in counter.
Image 41 Entrance to SC 262 testing center
Image 42 Check-in counter for SC 262

View of first testing room near entrance, northeast corner. View of first testing room from northwest corner.
Image 43 View of SC 262 testing room from NE
Image 44 View of SC 262 testing room from NW

View of first testing room from southwest corner. View of first testing room from southeast corner near exit.
Image 45 View of SC 262 testing room from SW
Image 46 View of SC 262 testing room from SE

Computers, monitor, keyboard, and mouse used in first testing room. Seating used in first testing room.
Image 47 Computers used in SC 262 testing room
Image 48 Seating used in SC 262 testing room

Overhead light fixture in first testing room. Accent/task lighting in first testing room.
Image 49 General Lighting in SC 262 testing room
Image 50 Accent lighting used in SC 262 testing room

Daylight into first testing room. Indirect daylight that passes through first testing room and check-in counter into second testing room.
Image 51 Daylight in SC 262 first testing room
Image 52 Daylight passing through to second SC 262 testing room

View of second testing room from southeast corner. View of second testing room from northeast corner.
Image 53 View of second SC 262 testing room from SE
Image 54 View of second SC 262 testing room from NE

View of second testing room from northwest corner. Computers, monitors, keyboards, mouse, and headphones used in second testing room.
Image 55 View of second SC 262 testing room from NW
Image 56 Computer used in second SC 262 testing room

Second type of seating in second testing room; primarily the same seating as in the first testing room was used. Third type of chair used in second testing room.
Image 57 Seating used for part of second SC 262 testing room
Image 58 More seating used for second SC 262 testing room

Ceiling light fixtures in second testing room. Storage for student materials by check-in counter.
Image 59 Lighting used in second SC 262 testing room
Image 60 Student storage for SC 262

Student Services Building - SC 262
Note: This testing center has two testing rooms; each room is rated separately. (West room is rated first.)
Table 9 – Design Analysis for SC 262
Item | Rating | Comments
Adherence to core principles of design – Are the principles of design present and appropriately applied: emphasis, rhythm, proportion, harmony, scale? | 4/3 | Emphasis is clearer in first room with paper seating being the focal point and the accent wall supporting this. Accent lighting in both rooms creates rhythm and balance. Too many seats disrupt harmony.
Lighting – Are there three types of lighting: ambient, task, and accent? Is there daylight? Is the quantity of light appropriate? | 5/4 | Lighting is excellent in the first room. Main light fixtures are controllable and indirect; accent lights along perimeter create mood and provide task lighting; direct daylight into the room that is not overpowering. Second room only has daylight that passes through the check-in area. Parabolic lenses good option for ambient light.
Furnishings – Are the furnishings appropriate to the task? Do they allow adjustment for height, size? | 4/3 | Seating in first room is adjustable for computers, not for paper tests. Some seating in second room is not mobile or adjustable. Tables are not height adjustable.
Space-planning – Are the traffic patterns and flow smooth and easy to navigate? Is there good use of space? | 3/3 | Both rooms are over-crowded; small spaces to navigate between paper seats and computers may create worry about exit; seating with back to door is undesirable.
Way-finding – Are the locations of entrances, exits, and the order of processing in and out clear? | 5/4 | Good signage above testing room door, at counter, and at entrance/exits to testing rooms. Multiple doors in testing room that are blocked or have signs to not use could be confusing.
Code compliance and other items particular to that site. | 4/4 | Fire Marshal suggested one less row of paper seats for safety; blocked doors.
Density – square feet/person | 18.9 / 17.7 | 907.7 ft2 / 48 seats and 612.2 ft2 / 35 seats

Student Union Building (SU 323)

This testing center is designated SU 323 and is located on the third floor of the J. Ferrell Shepherd Student Union Building.
Image 61 South entrance to Student Union

Entrance to the testing center. Check-in counter.
Image 62 Entrance to SU 323 testing center
Image 63 Check-in counter for SU 323

View of testing room from southwest corner near entrance. View of testing room from southeast corner.
Image 64 View of SU 323 testing room from SW
Image 65 View of SU 323 testing room from SE

View of testing room from northeast corner. View of testing room from northwest corner near exit.
Image 66 View of SU 323 testing room from NE
Image 67 View of SU 323 testing room from NW

Computer, monitor, keyboard, mouse, and headphones used for testing. Each computer is on a separate table.
Image 68 Computer used in SU 323 testing room
Image 69 Workstation used in SU 323 testing center

Seating in the testing room. Storage for student materials located next to the check-in counter.
Image 70 Seating used in SU 323 testing room
Image 71 Student storage for SU 323

Light fixtures in the testing room. Daylight on east wall of the testing room.
Image 72 Lighting used in SU 323 testing room
Image 73 Daylight in SU 323 testing room

Shepherd Student Union – SU 323
Table 10 – Design Analysis for SU 323
Item | Rating | Comments
Adherence to core principles of design – Are the principles of design present and appropriately applied: emphasis, rhythm, proportion, harmony, scale? | 5 | Student seating is focal point; rhythm and harmony created by color continuity; windows, doors, and seating create balance; furniture is to scale and quantity is appropriately proportionate to the space.
Lighting – Are there three types of lighting: ambient, task, and accent? Is there daylight? Is the quantity of light appropriate? | 5 | Ceiling fixtures are appropriate indirect fluorescent; bank of east windows provides daylight with shades to minimize glare.
Furnishings – Are the furnishings appropriate to the task? Do they allow adjustment for height, size? | 4 | Seating is adjustable, tables are not, but each computer has a separate 3' section.
Space-planning – Are the traffic patterns and flow smooth and easy to navigate? Is there good use of space? | 5 | Excellent entrance and egress, no bottlenecks; large check-in counter; vestibule creates sound privacy.
Way-finding – Are the locations of entrances, exits, and the order of processing in and out clear? | 5 | Excellent external signage on glass wall to identify testing center; clearly marked entrance and exit to testing room.
Code compliance and other items particular to that site. | 5 | No code issues.
Density – square feet/person | 23.0 ft2 | 968 ft2 / 42 seats |
Format | application/pdf |
ARK | ark:/87278/s6e00sy7 |
Setname | wsu_smt |
ID | 96738 |
Reference URL | https://digital.weber.edu/ark:/87278/s6e00sy7 |