Abstract

In response to calls for engineering programs to better prepare students for future careers, many institutions offer courses with a design component to first-year engineering students. This work proposes that traditional exam-based assessments of design concepts are inadequate, and alternative forms of assessment are needed to assess student learning in design courses. This paper investigates the self-efficacy differences between a traditional exam and a two-part practicum as a mid-semester assessment for introductory engineering students enrolled in a first-year design course. Increased self-efficacy has been linked to various positive student outcomes and increased retention of underrepresented students. The practicum consisted of an in-class team design task and an out-of-class individual reflection, while the exam was a traditional, individual written exam. All students completed a pre-assessment survey and a post-assessment survey, both of which included measures of design self-efficacy. Analysis showed that the practicum increased the design self-efficacy of students more effectively than the exam. Students who identified as women had greater gains in design self-efficacy during the practicum as compared with men. Identifying as a minority subgroup student was also trending toward being a significant predictor of change in design self-efficacy for the practicum. Findings suggest that a mid-semester practicum is a successful assessment of design competencies that contributes to increased first-year engineering student self-efficacy.

Introduction

Many first-year engineering design courses are taught using a project-based learning (PBL) approach, which posits that students learn more through experiential, hands-on, and open-ended challenges than through rote coursework [1]. PBL-structured first-year engineering design courses offer many benefits to students, instructors, and institutions, but they have traditionally been difficult to assess due to the subjective nature of the material [2]. For instance, Abdulaal et al. [3] contend that PBL is one of the best ways to teach design because it is correlated with high student achievement and satisfaction. Furthermore, first-year students self-report many positive benefits after completing a team-based PBL design course [4–6]. Both design and problem-solving based projects have been taught successfully using PBL in a variety of class structures including case studies, single-project, and multi-project structures [7]. PBL can also be easily adjusted to meet the requirements of the university, program, instructor, and individual students on a per-semester basis [8–10].

While PBL and other experiential pedagogical strategies for engineering are widely accepted in the design and engineering communities, implementation of these courses is not without challenges [7,9]. Disadvantages include costs in terms of resources, instructor preparation time, and smaller classes [7,9]. PBL is most successfully taught with smaller class sizes, which requires more instructors [7,9]. It has also been suggested that PBL design courses require a large number of faculty who are interested in and capable of teaching design, a resource which may not exist within all engineering departments [7]. While these costs might be offset by the retention of more engineering students, semester-long PBL courses may not be feasible for all institutions [7]. At institutions where a semester-long PBL course is impractical, a practicum could be implemented instead. Design practica are typically structured as open-ended design problems with a hands-on component and are limited in duration, lasting from an hour to a few days. Because practica share many of the same core principles as PBL but are shorter in duration, they can offer the benefits of a semester-long PBL course at a lower cost.

The rapid increase in PBL adoption was largely driven by the perceived misalignment between engineering educational practices and the needs of 21st century engineering [11–13]. For example, Tryggvason and Apelian [11] argue that engineering education needs modification and revision because technology has changed the engineering environment. As industry and engineering practice evolve within a shifting technological landscape, it is paramount that the education and training of engineering students keep pace [12,14]. For example, technology has facilitated a drastic increase in the globalization of businesses, and engineering graduates must be able to navigate that environment [11]. Engineering employers want graduates to have technical breadth, good theoretical understanding, and practical application skills [13]. In addition to technical skills, it has been suggested that the modern engineer requires innovation [12,13] and creativity skills [12,13], along with strong social [12,13] and interpersonal skills [13,14]. Engineering graduates must be able to communicate effectively [12,13], work successfully in teams [12,13], have business skills beyond commercial sensitivity [13], and be able to manage ambiguity and uncertainty [14,15]. As part of this movement for engineering education revision, some colleges now offer hands-on first-year engineering design courses to help satisfy engineering industry expectations and meet the needs of students.

This work specifically focuses on the assessment of first-year engineering design courses, which serve a broad variety of purposes. Primarily, the intent of these courses is to provide students with a realistic perspective on engineering practice via hands-on project-based courses early in their academic careers [3,8,16–19]. However, these courses are also often used to raise enthusiasm for engineering [8,10,20–22], create motivation for students to persist in their major [21], and increase students’ confidence [3,16,17,22]. First-year design courses are not only well intended but also result in many benefits to the students. Sheppard and Jenison [16] noted that almost all first-year engineering design courses examined in their paper helped to improve the communication skills and teamwork proficiencies of students. Similarly, other researchers found that verbal, visual, and written communication improved through first-year design courses [3,7,20]. Students also improved fundamental engineering skills including mathematical modeling [3,16], creativity [23,24], problem-solving [9,10,20], critical thinking abilities [10,22,25], and prototyping [16,18]. Moreover, students developed an awareness of local and global challenges and a fervor for lifelong learning [3]. Furthermore, particular skills can also be cultivated when design courses have specific foci. For example, students became more reflective [26,27], had an increased awareness of the social impact of design [7], and perceived an increase in their ability to solve multidisciplinary problems when special emphasis was put on these topics [28].

Not only do these courses benefit students but universities and programs also gain from offering such courses. Researchers determined that problem-based first-year engineering courses improved student retention in engineering, with significant gains in the retention of women and minority status engineers, and they found enhanced student satisfaction, diversity, and student learning [3,7,22,29]. These results are valuable and pertinent because other research has shown that engineering students’ motivational constructs decrease throughout the first year and that first-year women engineers hold more negative attitudes toward engineering when compared with men [30,31]. Due to the benefits of PBL-based first-year engineering design courses, it is important to improve assessment techniques in these courses so that instructors can adequately and accurately evaluate student learning, while also building a better understanding of the impact of such assessments. Improved assessment methods will allow instructors to more easily evaluate students’ progress and improve their PBL-based first-year engineering design courses.

Literature Review

The study presented in this work examines two types of assessments used in one PBL-structured first-year engineering design course. Different types of assessments were explored due to the difficulty of assessing subjective skills, like engineering design, using traditional exam-based methods [2]. For the remainder of this paper, the term PBL will refer to the style of PBL used for semester-long engineering courses, while practicum will refer to an assessment structured to reflect aspects of PBL.

Due to the aforementioned benefits of PBL, we hypothesized that a PBL-based practicum assessment would better align with the philosophy of the course and increase student self-efficacy when compared with a traditional written exam. This hypothesis is supported by the results of Stolk et al. [32], who conclude that students had more positive intrinsic motivation when participating in non-traditional (e.g., project-based) or mixed-pedagogy STEM courses, which is partially attributed to their basic self-efficacy need being satisfied. This paper reviews literature on the assessment of design courses and student self-efficacy related to design before detailing the methods, results, and implications of this particular study.

Assessment of Design.

This study was conducted at a large mid-Atlantic university within a large cornerstone engineering design course for first-year students [33]. The course had conventionally been assessed using exams, and as such, the alternative form of assessment being explored, the practicum, was compared with the exam. Written exams have traditionally been prevalent in engineering education as a method for evaluating the engineering competencies of students [34]. Rompelman [34] stated that written examinations are a part of the engineering educational culture. Exams became popular because, when designed well, they easily assess whether students have achieved predetermined educational goals [34]. Moreover, exams are considered “objective,” controlled, quantitative, and standardized [2,34]. Though some argue that engineering design can be assessed using exams [35], exams have disadvantages that tend to be magnified within project-based engineering courses [2]; traditional written exams are often inadequate for evaluating subjective skills, like design, and do not predict students’ design performance well [2]. For example, grading in design is normally subjective because design lacks conclusive experiments and design tasks rarely have right or wrong answers [2]. Consequently, educators are striving to develop assessment techniques for engineering design that evaluate design skills more effectively while retaining the advantages that exams offer [2,36,37].

Some of these alternative assessment techniques include design juries, peer evaluations, and journals [10]. While instructors may use papers or presentations to assess students [38], some engineering programs have opted for a practicum structure when assessing first-year engineering design courses. Commonly, practicum assessments are designed as in-class assessments (with a duration of a few hours), but they can also be developed as take-home exams (i.e., tens of hours to days). Practicum assessments align well with a PBL teaching style, and they share more aspects with an exam (e.g., a controlled setting) than papers or presentations do. In a practicum, students demonstrate their mastery of material by solving an open-ended and realistic design problem rather than completing a paper-and-pencil test [39]. For instance, students may be tasked with designing a solution for overcrowded animal shelters or finding a way to increase access to medical care in rural areas. These assessments provide an engaging experience for students and afford students a glimpse of what real-world engineering work may be like, which is unachievable with traditional exams. Practicum-style assessments align particularly well with courses that utilize a PBL format since they rely on many of the same core principles as a semester-long PBL course (e.g., an emphasis on hands-on and open-ended problems). When used with traditional lecture-based courses, adding a practicum assessment incorporates a beneficial hands-on component, and when used within a design course, the use of practica better aligns the method of assessment with learning theory [39].

While the benefit of first-year engineering design programs is well established [7,40,41], the extent to which the assessment style used in first-year design classes affects students’ psychological outcomes—particularly self-efficacy—has yet to be explored. Without appropriate assessment techniques, it is difficult for educators to evaluate the strength of a program or curriculum.

Self-Efficacy Theory Applied to First-Year Design.

Self-efficacy is defined as the confidence a person has in their ability to complete a prospective skill-specific task [42] and is shaped by mastery experiences, vicarious experiences, social persuasion, and physiological state [42–44]. The construct of self-efficacy has been found to influence students’ motivation [45,46], outcome expectancy [46], anxiety [46], persistence [47], grades/GPA [45,48], mathematical achievement scores [48], and perceived career options [47]. For first-year students, academic self-efficacy correlates with students’ classroom performance, stress, health, satisfaction, and commitment to continue their education [49]. In engineering, self-efficacy theory has been applied to predict retention and persistence of various groups of students [50]; for example, self-efficacy is a predictor of women's persistence in a field traditionally dominated by men, like engineering [51,52].

Competency-specific self-efficacy measures have also been created in engineering, such as writing self-efficacy [53,54] and design self-efficacy [46]. The measure for writing self-efficacy has been used to better understand how engineering graduate students conceptualize and relate to the writing process [54]. Furthermore, this measure has been used to show that writing self-efficacy is linked to the likelihood that a graduate student will pursue a certain engineering career after graduation [53]. Self-efficacy in design tasks can best be evaluated using a scale developed and validated by Carberry et al. [46], which measures a respondent's design self-efficacy, motivation, outcome expectancy, and anxiety throughout the design process. The design self-efficacy measure developed by Carberry et al. [46] has mainly been used in engineering education to study students’ design self-efficacy. For example, one recent study found that students who were more involved in academic makerspaces demonstrated greater engineering design self-efficacy [55].

Bandura [43,44] notes that mastery experiences are critical to the development of self-efficacy beliefs [56], and prior work has linked mastery experiences with increased self-efficacy in first-year engineering students. Mastery experiences have previously been found to have the greatest influence on self-efficacy, but contextual factors (e.g., gender and ethnicity) can mitigate the strength of the effect [57,58]. As such, this study explores whether an authentic practicum assessment (serving as a mastery experience) affects students’ design self-efficacy. While it cannot be guaranteed that students will perceive the practicum assessment as a mastery experience, the practicum is intentionally designed to be conducive to a mastery experience and is thus more likely to be perceived as such by students. Here, we use the term authentic to denote an educational experience that emulates attributes of an experience common in industrial practice [59]. We posit that authentic design practica serve as more effective mastery experiences for first-year students than traditional written exams. Measuring change in self-efficacy as a result of assessment gives unique insight into the development of students’ skills and can point toward best practices in assessments of PBL courses, such as first-year design.

Research Questions

As first-year design courses become more prevalent, it becomes increasingly important for educators to identify a formal assessment method that both evaluates subjective design skills and serves as an exciting, empowering introduction to engineering for students from all backgrounds. Specifically, this paper addresses two research questions:

  1. Is there a difference in the self-efficacy change experienced by students who take exams and those who take practica?

  2. If changes in student self-efficacy are observed, are they influenced by or dependent on the demographic traits of the students?

To address these questions, this paper compares differences in self-efficacy between first-year engineering students who were assessed via a traditional exam and those assessed via an authentic design practicum developed at a large mid-Atlantic university. Both assessments were used for summative mid-semester assessment. The practicum is compared with the exam because the exam was the previous standard of assessment for the course in which this study took place. We expected that a practicum structured around the same principles used in PBL would have positive psychological effects similar to those of PBL itself, unlike a non-project-based assessment (e.g., an exam). We will discuss the impact of practica-based assessments on first-year engineering student design self-efficacy in comparison with traditional paper exams.

Methods

An exam and practicum were developed for introductory engineering design courses as summative mid-semester assessments at a large mid-Atlantic university. Both assessments were designed to assess core design competencies covered in the course. A total of 46 students completed the exam and 50 students participated in the practicum for this institutional review board (IRB)-approved study. All students in the course completed the associated data collection instruments, but only the data for the students who consented to the study are presented here. Students were informed before consenting to the study that their participation (or choice not to participate) would not impact their grade. Demographic information for the participants is provided in Table 1. The demographic make-up of the two groups (i.e., the exam and the practicum) is not balanced because separate course sections received the two assessments.

Table 1

Demographic information (age, gender, race/ethnicity) for participants, practicum (n = 50), and exam (n = 46)

Age (years)                            Practicum    Exam
17                                     0            4
18                                     24           30
19                                     23           7
20                                     2            3
21                                     1            0
25                                     0            1
32                                     0            1

Gender                                 Practicum    Exam
Male                                   28           38
Female                                 22           8

Race                                   Practicum    Exam
White                                  34           28
Middle Eastern or Native African       5            1
Black or African American              1            1
Asian                                  6            6
Hispanic, Latino, or Spanish Origin    0            2
Multiple selections                    4            8

Both the exam and the practicum were intended to serve as summative mid-semester assessments. Specifically, these evaluations were intended to assess students’ knowledge of, and ability to apply, the design process and tools that they were taught earlier in the course. Learning objectives of this course included the ability to apply engineering design to address design opportunities; the development, use, and application of systems thinking in engineering design; the development of professional engineering skills; the ability to effectively communicate engineering concepts and designs; and providing the opportunity for students to gain experience in hands-on fabrication while developing a “maker” mindset. The exam was 2 hours in length and consisted of multiple-choice and short-answer questions addressing various portions of the engineering design process. The exam questions readily assessed whether students were able to apply engineering design to various design opportunities and whether they had developed a systems thinking mindset; however, the exam was less useful in determining whether students could effectively communicate engineering concepts, and it did not provide hands-on experience. The exam was completed individually in class. It was used as the comparison for the practicum because it had been the previous assessment standard for the course in which this study took place.

In contrast, the practicum was designed to engage students in an authentic design task. The practicum involved an in-class team design activity approximately 2 hours in length and an out-of-class individual reflection (students who took the exam did not complete this reflection) submitted within the 48 hours following the practicum. Students worked in teams to facilitate collaborative design. Teamwork is considered a critical skill for engineers, and the use of teamwork reinforced for students that designers rarely work alone in industry [16,59–63]. The structure of the practicum allowed the instructors to more comprehensively assess whether students were meeting all of the learning objectives of the course and also offered an additional opportunity for hands-on fabrication.

Students who were assessed via the practicum were directed to use the design processes, methods, and tools previously taught in class to design a product or service that would aid the residents of Cape Town, South Africa, in their water conservation efforts. The prompt given to the practicum students is as follows:

“Cape Town, a coastal city in South Africa, is running out of water. This is the result of a combination of factors, including thriving agriculture, a large population, and a regional drought. They estimate that by July 2018 there will be no more fresh water in their reservoirs. The government of South Africa has declared this a national disaster. It is obvious that Cape Town is in need of long-term solutions that will allow the large population to live sustainably. However, your task is much more short-term in nature. Design a product or service that will aid in water conservation efforts, helping the residents of Cape Town to extend their available water as long as possible.”

By having students design a product with social impact for a real-life crisis, this prompt incorporated the use of global perspectives and provided students with an empathetic engineering experience [64–66]. Students were given 48 hours after completing the in-class activity (i.e., the 2-hour team practicum) to submit an individual reflection, in which they considered the design process used by their team, discussed any gaps in this process, and analyzed decisions made during the design process. This style of reflection encouraged desirable learning outcomes by prompting metacognition [67]. The reflection also ensured that each student completed a deliverable as part of the assessment and aided the instructor in individually assessing each student's learning.

The experimental procedure is summarized in Fig. 1. Students completed the pre-assessment survey in class before completing the exam or participating in the practicum. The pre-assessment survey consisted of queries regarding age, gender, race/ethnicity, and degree of preparation for the assessment. The engineering design self-efficacy instrument developed by Carberry et al. [46] was also included in the pre-assessment. This instrument asks participants to rate their confidence in their ability to complete each of nine common tasks associated with engineering design, such as “construct a prototype,” on a scale from 0 to 100 [46].

Fig. 1. Experimental procedure. Both the exam and the practicum were 2 hours in duration and all out-of-class work was completed within 48 hours of the in-class session ending.

The post-assessment survey was taken by the exam students within 48 hours of completing the exam and the practicum students took it after completing their individual reflections (∼48 hours after completing the in-class task). This post-assessment survey included (1) the original Carberry et al. [46] engineering design self-efficacy instrument, and (2) a modified version of the engineering design self-efficacy instrument which requested that students rate the change in their confidence in their ability to complete each of nine engineering design tasks. The modified instrument used a scale from −50 (indicating a decrease) to 50 (indicating an increase). This enabled the evaluation of students’ perceived change in self-efficacy. For this study, computed self-efficacy will be used to refer to self-efficacy measures reported through the Carberry et al. [46] survey and perceived self-efficacy will refer to the measures taken from the supplemental survey in which students rated the perceived change in their degree of confidence for each of the self-efficacy measures.
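To make the two scoring procedures concrete, the following minimal Python sketch (not the authors' analysis code; all response values are hypothetical placeholders, not study data) computes the composite scores for a single student:

```python
# Minimal scoring sketch; the nine ratings below are hypothetical
# placeholders for one student's responses, not study data.
import numpy as np

# Confidence ratings (0-100) for the nine design tasks on the
# Carberry et al. instrument, collected pre- and post-assessment.
pre_responses = np.array([60, 55, 70, 40, 65, 50, 75, 45, 80])
post_responses = np.array([70, 65, 75, 55, 70, 60, 80, 55, 85])

# Ratings (-50 to +50) from the modified instrument, in which students
# rated the perceived change in their confidence for each task.
perceived_ratings = np.array([10, 5, 5, 15, 5, 10, 5, 10, 5])

computed_pre = pre_responses.sum()          # computed score, max = 900
computed_post = post_responses.sum()        # computed score, max = 900
computed_change = computed_post - computed_pre
perceived_change = perceived_ratings.sum()  # range = -450 to +450

print(computed_pre, computed_post, computed_change, perceived_change)
```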

Both assessments were delivered across two sections of the course taught by different instructors. Instructor A identifies as a white man, and, at the time of the exam and practicum, he was 28 years old. Instructor B identifies as a white woman and, at the time of the exam and practicum, she was 27 years old. The exam was used by both instructors during the initial semester and the practicum was administered by both instructors in the following semester.

Results

Responses (0–100) to the Carberry et al. [46] survey were summed for each student to determine an overall computed design self-efficacy score (maximum score = 900). Pre-assessment computed design self-efficacy scores (Carberry et al. [46] survey) for the practicum and the exam were compared using an independent two-group Mann–Whitney U test to confirm that there were no initial differences between the two groups of students (Z = −1.794, p > 0.05, r = 0.18). The pre-assessment and post-assessment computed design self-efficacy scores were then compared using dependent two-group Wilcoxon signed-rank tests for the exam and for the practicum separately (Fig. 2). There was no significant difference between pre- to post-assessment computed design self-efficacy scores for students who took the exam, Z = −0.127, p > 0.05, r = 0.019, but a significant difference was found from pre- to post-assessment for the students who participated in the practicum, Z = −3.514, p < 0.001, r = 0.497. In short, students who took the exam had no change in computed design self-efficacy, while students who took the practicum experienced an increase.
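As an illustration, tests of this form can be run with SciPy; the sketch below is not the authors' code and uses randomly generated placeholder scores in place of the study data:

```python
# Sketch of the nonparametric tests (not the authors' code); the score
# arrays are random placeholders for the summed 0-900 scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
exam_pre = rng.integers(400, 800, size=46)
exam_post = exam_pre + rng.integers(-60, 61, size=46)
prac_pre = rng.integers(400, 800, size=50)
prac_post = prac_pre + rng.integers(0, 121, size=50)

# Independent two-group Mann-Whitney U test: check for initial
# differences in pre-assessment scores between the two groups.
u_stat, p_initial = stats.mannwhitneyu(exam_pre, prac_pre,
                                       alternative="two-sided")

# Dependent two-group Wilcoxon signed-rank tests: pre- vs
# post-assessment scores within each group.
w_exam, p_exam = stats.wilcoxon(exam_pre, exam_post)
w_prac, p_prac = stats.wilcoxon(prac_pre, prac_post)
print(p_initial, p_exam, p_prac)
```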

Fig. 2. Comparison of pre-assessment to post-assessment computed design self-efficacy scores for exam students and practicum students. Error bars represent ±1 standard error.

The Pearson's product-moment correlation between perceived change in design self-efficacy and computed change in design self-efficacy was determined separately for the practicum and the exam. Change in computed design self-efficacy was the difference between pre- and post-assessment scores on the design self-efficacy instrument (i.e., the Carberry et al. [46] survey), while the score for perceived change in design self-efficacy was determined by summing the responses (−50 to 50) from the modified version of the design self-efficacy instrument for each student (maximum score = +450). For the exam, it was found that the perceived change in design self-efficacy was not correlated with the computed change in design self-efficacy, t(44) = 1.172, p > 0.05. Conversely, the perceived change in design self-efficacy for the practicum was correlated with the computed change in self-efficacy for the practicum, t(48) = 3.253, p < 0.01. In summary, perceived change in design self-efficacy was not correlated with computed change in design self-efficacy for the exam but was correlated for the practicum.
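For reference, the t statistics reported here follow directly from Pearson's r with n − 2 degrees of freedom; a sketch with hypothetical data:

```python
# Sketch relating Pearson's r to the reported t statistics (not the
# authors' code); the change scores below are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
computed_change = rng.normal(40, 60, size=50)
perceived_change = 0.4 * computed_change + rng.normal(0, 50, size=50)

r, p_value = stats.pearsonr(perceived_change, computed_change)
n = len(computed_change)
t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)  # df = n - 2
print(f"r = {r:.3f}, t({n - 2}) = {t_stat:.3f}, p = {p_value:.4f}")
```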

Independent two-group Mann–Whitney U tests were used to compare the perceived change in design self-efficacy for the exam and practicum students and the change in computed self-efficacy for the exam and practicum students. Results are provided in Fig. 3. There was a significant difference between perceived change in self-efficacy when comparing the exam and practicum students’ reports, Z = −6.467, p < 0.001, r = 0.660. The perceived change in design self-efficacy was higher for the practicum students. There was also a significant difference between the change in computed design self-efficacy for the practicum and the exam students (Z = −2.496, p < 0.05, r = 0.255). There was a greater increase in computed design self-efficacy for students who took the practicum compared with students who took the exam. Students who took the practicum had a greater increase in both computed and perceived changes in design self-efficacy when compared to students who took the exam.
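Because SciPy reports the Mann–Whitney U statistic rather than Z, Z values and effect sizes of the form r = |Z|/√N can be recovered via the normal approximation; the sketch below (not the authors' code, hypothetical change scores) shows one way to do this:

```python
# Sketch recovering Z and the effect size r = |Z| / sqrt(N) from a
# Mann-Whitney U statistic via the normal approximation (ignoring tie
# corrections); the change scores below are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
exam_change = rng.normal(0, 80, size=46)   # hypothetical change scores
prac_change = rng.normal(50, 80, size=50)

u, p = stats.mannwhitneyu(exam_change, prac_change, alternative="two-sided")
n1, n2 = len(exam_change), len(prac_change)
z = (u - n1 * n2 / 2) / np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
r_effect = abs(z) / np.sqrt(n1 + n2)
print(f"Z = {z:.3f}, p = {p:.4f}, r = {r_effect:.3f}")
```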

Fig. 3. Comparison of perceived and computed change in design self-efficacy scores for exam and practicum students. Error bars represent ±1 standard error.

Linear regression models were used to find predictors of perceived and computed change in design self-efficacy for the exam and the practicum (Fig. 4). For students taking the exam, the gender of the students and the instructor were not predictors of change in computed design self-efficacy (F(3,42) = 1.317, p > 0.05). This model for the exam predicted only 2% (adjusted R2) of the variance in the change in computed design self-efficacy scores. However, for students who took the exam, gender was a statistically significant predictor of perceived change in design self-efficacy (F(3,42) = 8.75, p < 0.001), and there was also a significant interaction of gender and instructor. This model for the exam predicted 34% (adjusted R2) of the variance in perceived change in design self-efficacy; women perceived greater gains in design self-efficacy. The women students in Instructor A's course reported greater gains in perceived design self-efficacy, while the men students in Instructor B's course reported greater gains in perceived design self-efficacy. In summary, the instructor and the gender of the students were not good predictors of change in computed self-efficacy for students who took the exam. However, gender was a good predictor of perceived change in self-efficacy for students who took the exam (women had greater gains), and there was an interaction effect of student gender and instructor for perceived change in self-efficacy.
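The structure of these models can be expressed with statsmodels; the sketch below is not the authors' code and uses a hypothetical data frame, since the study data are available only on request:

```python
# Sketch of the regression structure (not the authors' code); the data
# frame below is hypothetical, generated only to make the model runnable.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 46  # exam group size in this study
df = pd.DataFrame({
    "gender": rng.choice(["man", "woman"], size=n),
    "instructor": rng.choice(["A", "B"], size=n),
    "dse_change": rng.normal(20, 60, size=n),  # hypothetical outcome
})

# Gender, instructor, and their interaction as predictors; with four
# estimated coefficients this yields the overall F(3, 42) reported above.
model = smf.ols("dse_change ~ gender * instructor", data=df).fit()
print(model.fvalue, model.f_pvalue, model.rsquared_adj)
```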

Fig. 4. Perceived change in design self-efficacy for the practicum (a), change in computed design self-efficacy for the practicum (b), perceived change for the exam (c), and computed change for the exam (d).

For students who took the practicum, gender was not a statistically significant predictor of change in computed design self-efficacy (F(3,46) = 2.117, p > 0.05) but was a significant predictor of change in perceived design self-efficacy (F(3,46) = 3.513, p < 0.05). The practicum model for change in computed self-efficacy predicted 6% (adjusted R2) of the variance, and the practicum model for perceived change in self-efficacy predicted 13% (adjusted R2) of the variance. Figure 4 shows that women had greater perceived changes in design self-efficacy for the practicum. An independent two-group Mann–Whitney U test was used to compare the perceived design self-efficacy gains by women for the exam and the practicum (Z = −2.679, p < 0.001, r = 0.273); results show that women in the practicum had significantly higher gains in perceived design self-efficacy than women who took the exam. In summary, gender was not a predictor of change in computed design self-efficacy but was a good predictor of change in perceived design self-efficacy, and women who participated in the practicum had greater gains in perceived design self-efficacy than women who took the exam.

Linear regression models were also used to investigate whether race/ethnicity was a predictor of change in design self-efficacy for students who participated in the practicum and the exam. A relatively small sample of students identified as minorities for both the practicum and the exam (see Table 1). It was found that when minority subgroups were collapsed, a minority declaration was not a statistically significant predictor of computed (F(1,48) = 2.331, p > 0.05, adjusted R2 = 3%) or perceived (F(1,48) = 0.653, p > 0.05, adjusted R2 = 1%) change in design self-efficacy for the practicum. However, the change in computed design self-efficacy (f2 = 0.049) appears to be trending toward significance for the practicum; identifying as a minority may predict greater gains in self-efficacy. The small but nonzero effect size suggests that the effect might have reached significance with a larger sample of minority status students. Collapsed minority declaration was also not a significant predictor of computed (F(1,44) = 0.146, p > 0.05, adjusted R2 = −2%) or perceived (F(1,44) = 0.342, p > 0.05, adjusted R2 = −1%) change in design self-efficacy for students who took the exam. In summary, minority status was not a good predictor of perceived or computed change in self-efficacy for students who took the exam, nor of perceived change in self-efficacy for students who participated in the practicum. However, it is trending toward significance for gains in computed self-efficacy for minoritized students who participated in the practicum.
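Cohen's f2, used above to judge this trend, follows directly from the model's R2; a brief sketch:

```python
# Sketch of Cohen's f^2 from a model's R^2: f^2 = R^2 / (1 - R^2).
def cohens_f_squared(r_squared: float) -> float:
    """Cohen's f^2 effect size for a regression model with the given R^2."""
    return r_squared / (1.0 - r_squared)

# An unadjusted R^2 of about 0.047 corresponds to the reported f2 = 0.049,
# a small effect by Cohen's conventions (f^2 >= 0.02).
print(round(cohens_f_squared(0.047), 3))  # 0.049
```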

When white and minority subgroups that comprised at least 10% of the total participants (Asian and Middle Eastern/Native African for the practicum; Asian for the exam) were used as predictors, it was found that for the practicum (see Fig. 5), declarations of white or Middle Eastern/Native African were statistically significant predictors of change in computed design self-efficacy, F(3,46) = 3.095, p < 0.05, adjusted R2 = 11%, and a declaration of Asian was trending toward significance (d = 0.261). There were no significant race/ethnicity predictors of perceived change in design self-efficacy for the practicum when another linear regression model was used, F(3,46) = 0.138, p > 0.05, adjusted R2 = 1%. Subgroup results for race/ethnicity for the practicum are provided in Fig. 5. There were no significant predictors of computed (F(2,43) = 0.410, p > 0.05, adjusted R2 = −3%) or perceived (F(2,43) = 0.001, p > 0.05, adjusted R2 = −5%) change in design self-efficacy for the exam when minority subgroups were investigated. In summary, minority identity was not a good predictor of change in perceived or computed self-efficacy for students who took the exam, nor of perceived change in self-efficacy for students who participated in the practicum. However, identifying with certain minority subgroups was a good predictor of change in computed self-efficacy for students who participated in the practicum.

Fig. 5. Race/ethnicity results for perceived and computed change in design self-efficacy. Error bars represent ±1 standard error.

Discussion

All main results for this study are summarized in Table 2. In addressing whether there is a difference in the self-efficacy change experienced by students who take exams and those who take practica (i.e., the first research question), this study found that students’ design self-efficacy did not increase from pre- to post-assessment when they took a traditional exam-based assessment. However, an increase in design self-efficacy was observed in students who took a two-part practicum consisting of a team design activity and an individual reflection. It was also found that students who participated in the practicum had much greater increases in computed and perceived changes in design self-efficacy when compared with students who took the exam. These results indicate that the design practicum was a more effective assessment tool for increasing students’ design self-efficacy than a traditional exam. This conclusion is further supported by the fact that perceived change in design self-efficacy was correlated with computed change in design self-efficacy for the practicum but not for the exam. As expected, these findings align well with work by Stolk et al. [32], which concluded that non-traditional STEM courses increase the intrinsic motivation of students, likely due in part to high student self-efficacy in these courses.

Table 2

Summary of results

Pre- to post-assessment computed DSE (Fig. 2)
  Exam: No significant differences from pre- to post-assessment.
  Practicum: Significant increases from pre- to post-assessment.

Change in DSE (Fig. 3)
  Perceived: Practicum students had significantly greater increases when compared with exam students.
  Computed: Practicum students had significantly greater increases when compared with exam students.

Gender and instructor as predictors of design self-efficacy (Fig. 4)
  Gender: Exam: gender was a significant predictor of perceived change in DSE but did not predict computed change in DSE. Practicum: gender was a significant predictor of perceived change in DSE but did not predict computed change in DSE.
  Instructor: Exam: instructor was not a significant predictor of computed or perceived change in DSE. Practicum: instructor was not a significant predictor of computed or perceived change in DSE.
  Interaction: Exam: there was a significant interaction of gender and instructor for perceived change in DSE but not for computed change in DSE. Practicum: there was no interaction effect for computed or perceived change in DSE.

Collapsed minority declaration
  Exam: Not a significant predictor of perceived or computed change in DSE.
  Practicum: Not a significant predictor of perceived change in DSE but may be trending toward a significant predictor of computed change in DSE.

Minority subgroup declaration (Fig. 5)
  Exam: Not a significant predictor of perceived or computed change in DSE.
  Practicum: Declarations of white or Middle Eastern/Native African were statistically significant predictors, and a declaration of Asian was trending toward being a significant predictor of change in computed DSE. No minority subgroups were significant predictors of perceived change in DSE.

Note: DSE refers to design self-efficacy.

Although the primary purpose of an evaluation is not to increase self-efficacy, doing so is an added benefit with substantial effects. The increase in self-efficacy is likely due to the same positive effects of mastery experiences (i.e., authentic successes) denoted in prior literature [7,46–49,51,58]. This is important because design practica provide educators with a way not only to assess design curricula and student learning but also to contribute to the formation of engineering identities, possibly leading to higher retention of engineering students [50,52]. An assessment that increases self-efficacy better aligns with the goals and purpose of first-year design courses, mainly to increase motivation and enthusiasm for engineering [3,8,16–18,20,22].

The remainder of the discussion considers findings relevant to the second research question, which focused on the influence of demographic traits on changes in self-efficacy. This study found that the change in computed design self-efficacy for the exam was not influenced by gender or instructor. However, perceived change in design self-efficacy for the exam was influenced by gender: women perceived greater increases in design self-efficacy when compared with men. Increases in women students’ self-efficacy could increase the number of women planning to persist in traditionally male-dominated fields like engineering [51,52]. Taking the exam may have allowed students to more accurately perceive their self-efficacy than they could before the exam [68,69]. In other words, students may have become aware of how much or how little they actually knew about the concepts presented. It is possible that women perceived a greater increase in design self-efficacy because they were better able to accurately assess their self-efficacy after the exam. However, it is unknown whether women had lower perceived self-efficacy before the exam because the instrument only asks students to rate their perceived change in design self-efficacy. It is unlikely, though, that women had lower perceived self-efficacy before the exam because no differences by gender are seen in the pre-survey computed self-efficacy scores. Nevertheless, future studies should determine whether women students have lower perceived design self-efficacy throughout the course, including before the mid-semester assessment. Previous literature has demonstrated that incoming women students perceive themselves as academically weaker when compared with the perceptions of men students, even though their academic performances are similar [70].

There was also an interaction of gender and instructor for perceived change in design self-efficacy for the exam. Instructor A identifies as a white man and Instructor B identifies as a white woman. In Instructor A's course, women students reported greater gains in perceived design self-efficacy than men students; on the contrary, in Instructor B's course, men students reported greater gains than women students. One possible explanation could be that the students in each class differed significantly in their pre-assessment self-efficacy scores by gender and therefore had different opportunities for gains in self-efficacy. When this possibility was investigated, it was found that there were no pre-survey differences in design self-efficacy by gender for either course; thus it is unlikely that students had different opportunities for gains in self-efficacy. It is more likely that differences between Instructor A and Instructor B contributed to this effect. Many aspects of teaching method, teaching style, and instructional strategy have been linked to student self-efficacy [58,71], any one of which may explain this interaction. Due to the considerable differences between Instructor A and Instructor B, no other conclusions could be made about the cause of the interaction effect.

For the practicum, gender only influenced perceived change in design self-efficacy and did not influence change in computed design self-efficacy. It was found that women had greater perceived gains in design self-efficacy for the practicum when compared with men students. This may, again, be due to women perceiving themselves as academically weaker before the assessment and more accurately assessing themselves after the assessment, but it is also possible that the mastery experience has a greater influence on women's self-efficacy than on men's self-efficacy [57,58,70]. Mastery experiences in historically male disciplines have been found to influence women's self-efficacy differently than men's [57]. Women also had higher perceived gains in design self-efficacy for the practicum when compared with women who took the exam, most likely because the mastery experience of the practicum was more influential on self-efficacy than an exam. This is an important finding because it shows that a practicum-style assessment has the ability to increase the self-efficacy of underrepresented groups in engineering.

In addition, it was determined that identifying as certain races/ethnicities can predict change in computed design self-efficacy scores for the practicum. Furthermore, identifying as a minority subgroup student was trending toward being a significant predictor of change in computed self-efficacy for the practicum but not for the exam. Previous research suggests that aspects of a mastery experience can have a greater influence on the self-efficacy of certain ethnic groups (e.g., some ethnic groups are influenced by vicarious experience more than others) [57]. It has also been shown that the confidence of ethnic group members develops through different factors than the confidence of white students (e.g., other-oriented versus self-oriented formation of confidence) [57]. Minority status was trending toward significance for the practicum but was not significant for the exam. This could suggest that the team aspect of the practicum allowed social persuasion and vicarious experience to contribute to minority status students’ self-efficacy. Previous research has found that minoritized students appreciated learning from, and benefited from, their teammates during design scenarios [72]. This result highlights the ability of the practicum to further affect underrepresented groups in engineering.

Limitations and Recommendations for Future Research.

This study was constrained by a small and relatively homogeneous sample (in terms of racial diversity) due to the limited diversity of the institution. However, the results based on this sample are promising as they indicate that the practicum could be used to increase self-efficacy of women and may increase the self-efficacy of minority status students. Future work could expand the study to include additional universities and institutions with more diverse student populations to better understand in what environment such an intervention would be the most beneficial. The results also suggest that teaching style or the positionality of the instructor may influence perceived self-efficacy gains for students taking traditional exams. However, because of the small number of instructors in this experiment, these results are anecdotal at best. A future replication study with a larger sample size, greater demographic diversity, and multiple class sections at multiple institutions is necessary to pursue this result further. It is also unknown if these gains in self-efficacy span long durations of time because there were no follow-up surveys distributed to participants. Understanding that our small sample only included two faculty instructors, a replication study should be designed to more fully investigate the effects of instructor identity on student performance and outcomes. Additionally, the replication study should address whether the effects seen from the practicum were due to the team design task or to the individual reflection.

Future work should conduct longitudinal studies with participants to explore if the differences in design self-efficacy are still prevalent after a significant period of time has passed. Furthermore, future work should investigate what parts of the practicum are most important for enabling students to achieve higher self-efficacy. Finally, future work should determine the effects of different types of practicum assessments on student self-efficacy and explore the effects of an individual style practicum rather than a team design task.

Conclusion

This study examined whether and to what extent assessments influence design self-efficacy by comparing a traditional exam and a practicum in a PBL design course for first-year engineering students. The effects of the students’ demographic traits on design self-efficacy for both types of assessments were also investigated. When analyzing pre-assessment and post-assessment measures of self-reported design self-efficacy, students who had participated in the practicum had greater gains in computed and perceived design self-efficacy than students who had completed the exam. The practicum also increased women's design self-efficacy more effectively than it did men's. This suggests that a practicum assessment is more effective than a traditional exam in increasing the design self-efficacy of introductory students, especially students from underrepresented groups. An assessment which increases students’ self-efficacy better aligns with the goals and purpose of a first-year design course. Increasing self-efficacy in this way holds significant implications for the long-term retention of students in engineering degree programs.

Acknowledgment

An early version of this work was included in the proceedings of the 2020 ASME IDETC [73].

Funding Data

  • The first author's research assistantship is supported by the Defense Advanced Research Projects Agency through cooperative agreement N66001-17-4064. Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the sponsor.

Conflict of Interest

There are no conflicts of interest.

Data Availability Statement

The datasets generated and supporting the findings of this article are obtainable from the corresponding author upon reasonable request.

References

1.
Savery
,
J. R.
, and
Duffy
,
T. M.
,
1995
, “Problem-Based Learning: An Instructional Model and Its Constructivist Framework,”
Constructivist Learning Environments: Case Studies in Instructional Design
,
B.
Wilson
, ed.,
Educational Technology Publications
,
Englewood Cliffs, NJ
, pp.
135
150
.
2.
Miller
,
S. R.
,
Bailey
,
B. P.
, and
Kirlik
,
A.
,
2012
, “
Towards
,”
119th ASEE Annual Conference and Exposition.
,
San Antonio, TX
,
June 10–13
.
3.
Abdulaal
,
R. M.
,
Al-Bahi
,
A. M.
,
Soliman
,
A. Y.
, and
Iskanderani
,
F. I.
,
2011
, “
European Journal of Engineering Education Design and Implementation of a Project-Based Active/Cooperative Engineering Design Course for Freshmen
,”
Eur. J. Eng. Educ.
,
36
(
4
), pp.
391
402
. 10.1080/03043797.2011.598498
4.
Courter
,
S. S.
,
Millar
,
S. B.
, and
Lyons
,
L.
,
1998
, “
From the Students’ Point of View: Experiences in a Freshman Engineering Design Course
,”
J. Eng. Educ.
,
87
(
3
), pp.
283
288
. 10.1002/j.2168-9830.1998.tb00355.x
5.
Edward
,
N. S.
,
2004
, “
Evaluations of Introducing Project-Based Design Activities in the First and Second Years of Engineering Courses
,”
Eur. J. Eng. Educ.
,
29
(
4
), pp.
491
503
. 10.1080/03043790410001716284
6.
Pajares
,
F.
,
2003
, “
Self-Efficacy Beliefs, Motivation, and Achievement in Writing: A Review of the Literature
,”
Read. Writ. Q.
,
19
(
2
), pp.
139
158
. 10.1080/10573560308222
7.
Dym
,
C. L.
,
Agogino
,
A. M.
,
Eris
,
O.
,
Frey
,
D. D.
, and
Leifer
,
L. J.
,
2005
, “
Engineering Design Thinking, Teaching, and Learning
,”
J. Eng. Educ.
,
94
(
1
), pp.
103
120
.
8.
Ambrose
,
S. A.
, and
Amon
,
C. H.
,
1997
, “
Systematic Design of a First-Year Mechanical Engineering Course at Carnegie Mellon University
,”
J. Eng. Educ.
,
86
(
2
), pp.
173
181
. 10.1002/j.2168-9830.1997.tb00281.x
9.
Bazylak
,
J.
, and
Wild
,
P.
,
2007
, “
Best Practices Review of First-Year Engineering Design Education
,”
Canadian Engineering Education Association (CEEA)
,
University of Manitoba Winnipeg, Manitoba
,
July 22–24
.
10.
Campbell
,
S.
, and
Colbeck
,
C. L.
,
1998
,
American Society for Engineering Education
.
11.
Tryggvason
,
G.
, and
Apelian
,
D.
,
2006
, “
Re-Engineering Engineering Education for the Challenges of the 21st Century
,”
J. Miner. Met. Mater. Soc.
,
58
(
10
), pp.
14
17
. 10.1007/s11837-006-0194-6
12.
Giriyapur
,
A. C.
, and
Kotturshettar
,
B. B.
,
2015
, “
Twenty-First Century Classroom Engineering—Designing Effective Learning Environments: A Conceptual Case Study
,”
Proceedings of the International Conference on Transformations in Engineering Education
,
Bangalore, India
,
Jan. 5–8
.
13.
Spinks
,
N.
,
Silburn
,
N.
, and
Birchall
,
D.
,
2006
,
Educating Engineers for the 21st Century: The Industry View
,
Royal Academy of London
,
Oxfordshire, UK
.
14.
Etter
,
D. M.
, and
Bordogna
,
J.
,
1994
, “
Engineering Education for the 21st Century
,”
IEEE International Conference on Acoustics, Speech and Signal Processing
,
IEEE
,
New York, NY
.
15.
Mistree
,
F.
,
2013
, “
Strategic Design Engineering: A Contemporary Paradigm for Engineering Design Education for the 21st Century?
ASME J. Mech. Des.
,
135
(
9
), p.
090301
. https://doi.org/10.1115/1.4025125
16.
Sheppard
,
S.
, and
Jenison
,
R.
,
1997
, “
Examples of Freshman Design Education
,”
Int. J. Eng. Educ.
,
13
(
4
), pp.
248
261
.
17.
Carlson
,
B.
,
Schoch
,
P.
,
Kalsher
,
M.
, and
Racicot
,
B.
,
1997
, “
A Motivational First-Year Electronics Lab Course
,”
J. Eng. Educ.
,
86
(
4
), pp.
357
362
. 10.1002/j.2168-9830.1997.tb00309.x
18.
Wood
,
K. L.
,
Jensen
,
D.
,
Bezdek
,
J.
, and
Otto
,
K. N.
,
2001
, “
Reverse Engineering and Redesign: Courses to Incrementally and Systematically Teach Design
,”
J. Eng. Educ.
,
90
(
3
), pp.
363
374
. 10.1002/j.2168-9830.2001.tb00615.x
19.
Frank
,
M.
,
Lavy
,
I.
, and
Elata
,
D.
,
2003
, “
Implementing the Project-Based Learning Approach in an Academic Engineering Course
,”
Int. J. Technol. Des. Educ.
,
13
(
3
), pp.
273
288
. 10.1023/A:1026192113732
20.
Burton
,
J. D.
, and
White
,
D. M.
,
1999
, “
Selecting a Model for Freshman Engineering Design
,”
J. Eng. Educ.
,
33
(
3
), pp.
327
333
. 10.1002/j.2168-9830.1999.tb00454.x
21.
Kramlich
,
J. C.
,
Safoutin
,
M. J.
,
Atman
,
C. J.
,
Adams
,
R.
,
Rutar
,
T.
, and
Fridley
,
J. L.
,
2000
, “
A Design Attribute Framework for Course Planning and Learning Assessment
,”
IEEE Trans. Educ.
,
43
(
2
), pp.
188
199
.
22.
Dally
,
J. W.
, and
Zhang
,
G. M.
,
1993
, “
A Freshman Engineering Design Course
,”
J. Eng. Educ.
,
82
(
2
), pp.
83
91
. https://doi.org/10.1002/j.2168-9830.1993.tb00081.x
23.
Tolbert
,
D. A.
, and
Daly
,
S. R.
,
2013
, “
First-Year Engineering Student Perceptions of Creative Opportunities in Design
,”
Int. J. Eng. Educ.
,
29
(
4
), pp.
879
890
.
24.
Charyton
,
C.
, and
Merrill
,
J. A.
,
2009
, “
Assessing General Creativity and Creative Engineering Design in First Year Engineering Students
,”
J. Eng. Educ.
,
98
(
2
), pp.
145
156
. 10.1002/j.2168-9830.2009.tb01013.x
25.
Savage
,
R. N.
,
Chen
,
K. C.
, and
Vanasupa
,
L.
,
2007
, “
Integrating Project-Based Learning Throughout the Undergraduate Engineering Curriculum
,”
J. Stem Educ. Innov. Res.
,
8
(
3–4
), pp.
15
27
.
26. Adams, R. S., Turns, J., and Atman, C. J., 2003, “Educating Effective Engineering Designers: The Role of Reflective Practice,” Des. Stud., 24(3), pp. 275–294. 10.1016/S0142-694X(02)00056-X
27. Chen, H. L., Cannon, D., Gabrio, J., Leifer, L., Toye, G., and Bailey, T., 2005, “Using Wikis and Weblogs to Support Reflective Learning in an Introductory Engineering Design Course,” 2005 American Society for Engineering Education Annual Conference & Exposition, Portland, OR, June 12–15, pp. 95–105.
28. Telenko, C., Wood, K., Otto, K., Elara, M. R., Foong, S., Pey, K. L., Tan, U. X., Camburn, B., Moreno, D., and Frey, D., 2016, “Designettes: An Approach to Multidisciplinary Engineering Design Education,” ASME J. Mech. Des., 138(2), p. 022001. 10.1115/1.4031638
29. Knight, D. W., Carlson, L. E., and Sullivan, J. F., 2003, “Staying in Engineering: Impact of a Hands-On, Team-Based, First-Year Projects Course on Student Retention,” 2003 American Society for Engineering Education Annual Conference & Exposition, Nashville, TN, June 22–25.
30. Besterfield-Sacre, M., Moreno, M., Shuman, L. J., and Atman, C. J., 2001, “Gender and Ethnicity Differences in Freshmen Engineering Student Attitudes: A Cross-Institutional Study,” J. Eng. Educ., 90(4), pp. 477–489. 10.1002/j.2168-9830.2001.tb00629.x
31. Jones, B. D., Paretti, M. C., Hein, S. F., and Knott, T. W., 2010, “An Analysis of Motivation Constructs With First-Year Engineering Students: Relationships Among Expectancies, Values, Achievement, and Career Plans,” J. Eng. Educ., 99(4), pp. 319–336. 10.1002/j.2168-9830.2010.tb01066.x
32. Stolk, J. D., Zastavker, Y. V., and Gross, M. D., 2018, “Gender, Motivation, and Pedagogy in the STEM Classroom: A Quantitative Characterization,” ASEE Annual Conference & Exposition Conference Proceedings, Salt Lake City, UT, June 24–27.
33. Ritter, S. C., and Bilen, S. G., 2019, “EDSGN 100: A First-Year Cornerstone Engineering Design Course,” FYEE Conference, State College, PA, July 28–30.
34. Rompelman, O., 2000, “Assessment of Student Learning: Evolution of Objectives in Engineering Education and the Consequences for Assessment,” Eur. J. Eng. Educ., 25(4), pp. 339–350. 10.1080/03043790050200386
35. Schilling, W. W., 2012, “Effective Assessment of Engineering Design in an Exam Environment,” ASEE Annual Conference & Exposition, San Antonio, TX, June 10–13.
36. Bailey, R., and Szabo, Z., 2006, “Assessing Engineering Design Process Knowledge,” Int. J. Eng. Educ., 22(3), pp. 508–518.
37. Keefe, M., Glancey, J., and Cloud, N., 2007, “Assessing Student Team Performance in Industry Sponsored Design Projects,” ASME J. Mech. Des., 129(7), pp. 692–700. 10.1115/1.2722791
38. McKenzie, L. J., Trevisan, M. S., Davis, D. C., and Beyerlein, S. W., 2004, “Capstone Design Courses and Assessment: A National Study,” 2004 American Society for Engineering Education Annual Conference & Exposition, pp. 1–14.
39. McComb, C., Berdanier, C., and Menold, J., 2018, “Design Practica as Authentic Assessments in First-Year Engineering Design Courses,” FYEE Conference, Glassboro, NJ, July 24–26.
40. Freuler, R. J., Fentiman, A. W., Demel, J. T., Gustafson, R. J., and Merrill, J. A., 2001, “Developing and Implementing Hands-on Laboratory Exercises and Design Projects for First Year Engineering Students,” American Society for Engineering Education Annual Conference & Exposition, Albuquerque, NM, June 24–27.
41. Schaeffer, J. A., and Palmgren, M., 2017, “Visionary Expectations and Novice Designers – Prototyping in Design Education,” Des. Technol. Educ., https://eric.ed.gov/?id=EJ1137743, Accessed September 18, 2019.
42. Schunk, D. H., and Pajares, F., 2009, Handbook of Motivation at School, Routledge, New York.
43. Bandura, A., 1977, “Self-Efficacy: Toward a Unifying Theory of Behavioral Change,” Psychol. Rev., 84(2), pp. 191–215. 10.1037/0033-295X.84.2.191
44. Bandura, A., 1982, “Self-Efficacy Mechanism in Human Agency,” Am. Psychol., 37(2), pp. 122–147. 10.1037/0003-066X.37.2.122
45. Ponton, M. K., Edmister, J. H., Ukeiley, L. S., and Seiner, J. M., 2001, “Understanding the Role of Self-Efficacy in Engineering Education,” J. Eng. Educ., 90(2), pp. 247–251. 10.1002/j.2168-9830.2001.tb00599.x
46. Carberry, A. R., Lee, H. S., and Ohland, M. W., 2010, “Measuring Engineering Design Self-Efficacy,” J. Eng. Educ., 99(1), pp. 71–79. 10.1002/j.2168-9830.2010.tb01043.x
47. Lent, R. W., Brown, S. D., and Larkin, K. C., 1986, “Self-Efficacy in the Prediction of Academic Performance and Perceived Career Options,” J. Couns. Psychol., 33(3), pp. 265–269. 10.1037/0022-0167.33.3.265
48. Loo, C. W., and Choy, J. L. F., 2013, “Sources of Self-Efficacy Influencing Academic Performance of Engineering Students,” Am. J. Educ. Res., 1(3), pp. 86–92. 10.12691/education-1-3-4
49. Chemers, M. M., Hu, L.-T., and Garcia, B. F., 2001, “Academic Self-Efficacy and First-Year College Student Performance and Adjustment,” J. Educ. Psychol., 93(1), pp. 55–64. 10.1037/0022-0663.93.1.55
50. Schaefers, K. G., Epperson, D. L., and Nauta, M. M., 1997, “Women’s Career Development: Can Theoretically Derived Variables Predict Persistence in Engineering Majors?,” J. Couns. Psychol., 44(2), pp. 173–183. 10.1037/0022-0167.44.2.173
51. Marra, R. M., Rodgers, K. A., Shen, D., and Bogue, B., 2009, “Women Engineering Students and Self-Efficacy: A Multi-Year, Multi-Institution Study of Women Engineering Student Self-Efficacy,” J. Eng. Educ., 98(1), pp. 27–38. 10.1002/j.2168-9830.2009.tb01003.x
52. Verdin, D., and Godwin, A., 2018, “Exploring Latina First-Generation College Students’ Multiple Identities, Self-Efficacy, and Institutional Integration to Inform Achievement in Engineering,” J. Women Minor. Sci. Eng., 24(3), pp. 261–290. 10.1615/JWomenMinorScienEng.2018018667
53. Berdanier, C. G. P., and Zerbe, E., 2018, “Correlations Between Graduate Student Writing Concepts and Processes and Certainty of Career Trajectories,” 2018 IEEE Frontiers in Education Conference (FIE), San Jose, CA, Oct. 3–6.
54. Berdanier, C., and Zerbe, E., 2018, “Quantitative Investigation of Engineering Graduate Student Conceptions and Processes of Academic Writing,” IEEE International Professional Communication Conference, Kansas City, MO, May 20–24.
55. Hilton, E. C., Talley, K. G., Smith, S. F., Nagel, R. L., and Linsey, J. S., 2020, “Report on Engineering Design Self-Efficacy and Demographics of Makerspace Participants Across Three Universities,” ASME J. Mech. Des., 142(10), p. 102301. 10.1115/1.4046649
56. Hutchison, M. A., Follman, D. K., Sumpter, M., and Bodner, G. M., 2006, “Factors Influencing the Self-Efficacy Beliefs of First-Year Engineering Students,” J. Eng. Educ., 95(1), pp. 39–47. 10.1002/j.2168-9830.2006.tb00876.x
57. Usher, E. L., and Pajares, F., 2008, “Sources of Self-Efficacy in School: Critical Review of the Literature and Future Directions,” Rev. Educ. Res., 78(4), pp. 751–796. 10.3102/0034654308321456
58. van Dinther, M., Dochy, F., and Segers, M., 2011, “Factors Affecting Students’ Self-Efficacy in Higher Education,” Educ. Res. Rev., 6(2), pp. 95–108.
59. Han, Y.-L., Cook, K., Mason, G., and Shuman, T. R., 2018, “Enhance Engineering Design Education in the Middle Years With Authentic Engineering Problems,” ASME J. Mech. Des., 140(12), p. 122001. 10.1115/1.4040880
60. Baird, F., Moore, C., and Jagodzinski, A., 2000, “An Ethnographic Study of Engineering Design Teams at Rolls-Royce Aerospace,” Des. Stud., 21(4), pp. 333–355. 10.1016/S0142-694X(00)00006-5
61. Reiter-Palmon, R., Herman, A. E., and Yammarino, F., 2008, Multi-Level Issues in Creativity and Innovation, Vol. 7, M. D. Mumford, S. T. Hunter, and K. E. Bedell-Avers, eds., Emerald Group Publishing Limited, Bingley, pp. 203–267.
62. Paulus, P. B., Dzindolet, M., and Kohn, N. W., 2011, “Collaborative Creativity—Group Creativity and Team Innovation,” Handbook of Organizational Creativity, M. D. Mumford, ed., Elsevier, New York, pp. 327–357.
63. Brown, M., 2016, “5 Skills Hiring Managers Look for in Engineering Grads,” engineering.com.
64. Walther, J., Miller, S. E., and Sochacka, N. W., 2017, “A Model of Empathy in Engineering as a Core Skill, Practice Orientation, and Professional Way of Being,” J. Eng. Educ., 106(1), pp. 123–148. 10.1002/jee.20159
65. Mihelcic, J. R., Phillips, L. D., and Watkins, D. W., 2006, “Integrating a Global Perspective Into Education and Research: Engineering International Sustainable Development,” Environ. Eng. Sci., 23(3), pp. 426–438. 10.1089/ees.2006.23.426
66. Downey, G. L., Lucena, J. C., Moskal, B. M., Parkhurst, R., Bigley, T., Hays, C., Jesiek, B. K., Kelly, L., Miller, J., Ruff, S., Lehr, J. L., and Nichols-Belo, A., 2006, “The Globally Competent Engineer: Working Effectively With People Who Define Problems Differently,” J. Eng. Educ., 95(2), pp. 107–122. 10.1002/j.2168-9830.2006.tb00883.x
67. Ford, J. K., Gully, S., and Salas, E., 1998, “Relationships of Goal Orientation, Metacognitive Activity, and Practice Strategies With Learning Outcomes and Transfer,” J. Appl. Psychol., 83(2), pp. 218–233. 10.1037/0021-9010.83.2.218
68. Wood, R. E., and Locke, E. A., 1987, “The Relation of Self-Efficacy and Grade Goals to Academic Performance,” Educ. Psychol. Meas., 47(4), pp. 1013–1024. 10.1177/0013164487474017
69. Vrugt, A. J., Langereis, M. P., and Hoogstraten, J., 1997, “Academic Self-Efficacy and Malleability of Relevant Capabilities as Predictors of Exam Performance,” J. Exp. Educ., 66(1), pp. 61–72. 10.1080/00220979709601395
70. MacPhee, D., Farro, S., and Canetto, S. S., 2013, “Academic Self-Efficacy and Performance of Underrepresented STEM Majors: Gender, Ethnic, and Social Class Patterns,” Anal. Soc. Issues Public Policy, 13(1), pp. 347–369. 10.1111/asap.12033
71. Green, D. M., 2003, “Self-Efficacy: A Communication Model for the Development of Self-Efficacy in the Classroom,” J. Teach. Soc. Work, 23(3–4), pp. 107–116. 10.1300/J067v23n03_09
72. Chen, P., Hernandez, A., and Dong, J., 2015, “Impact of Collaborative Project-Based Learning on Self-Efficacy of Urban Minority Students in Engineering,” J. Urban Learn. Teach. Res., 11, pp. 26–39.
73. Nolte, H., Berdanier, C., Menold, J., and McComb, C., 2020, “Comparison of Exams and Design Practica for Assessment in First Year Engineering Design Courses,” ASME 2020 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference, Virtual Conference, Aug. 17–19.