
Effective User Experience in Online Technical
Communication Courses: Employing Multiple Methods
within Organizational Contexts to Assess Usability
Marjorie Rush Hovde
Indiana University-Purdue University
Indianapolis
ET 324F, 799 W. Michigan Street
Indianapolis, IN
1-765-274-0825
mhovde@iupui.edu
ABSTRACT
In teaching online technical communication courses, shaping an
electronic interface requires extensive consideration of the user
experience, both for students and for faculty members who design
and teach the courses. Technical communication faculty members
should provide strong examples of effective user experiences and
should be leaders in making the interfaces of online learning
management systems as usable as possible.
Principles of usability designed for general web sites may or may
not apply to learning management systems designed for
educational purposes. In order to create effective online technical
communication courses, one needs to consider both usability
concerns and pedagogical concerns.
To assess the usability and pedagogical effectiveness of online
courses, faculty members may use indirect means such as
heuristic analyses. In addition, they may use direct means such as
usability testing, student feedback, and analytic tools. Each
approach has advantages as well as limitations. Faculty members
will gain the richest information through using multiple
approaches.
In assessing usability and pedagogical effectiveness, faculty
members also need to consider the situational constraints and
resources in their unique contexts. Understanding and adapting
their approaches to use resources well and to work within
constraints will benefit their abilities to enhance their student
users’ experiences with online courses.
Categories and Subject Descriptors
H.5.2 User Interfaces: user-centered design
K.3.1 Computer Uses in Education: distance learning
General Terms
Human Factors
Keywords
User interface, Learning management systems, Teaching technical
communication, Heuristics, Usability testing, Online course
usability.
1. INTRODUCTION
In online technical communication courses, designing the
interface requires extensive consideration of the user experience
for students and for faculty members who design and teach the
courses. Technical communication faculty members should
provide strong examples of effective user experiences and should
be leaders in making the interfaces of online learning management
systems (LMS) as usable as possible. A usable interface in an
LMS can help to overcome some of these challenges and provide
a worthwhile user experience. Within organizational contexts,
technical communication faculty members need to consider the
resources they might use to enhance usability while also taking
into account the constraints of their situations that might limit the
level of usability they are able to achieve.
2. PEDAGOGICAL AND USABILITY
CONCERNS IN ONLINE COURSES
One challenge in considering usability in online courses is that the
typical principles of usability for online material relate to general
situations, such as those found in commercial websites, but
learning situations have additional dimensions of usability,
beyond those found in web sites designed for other purposes [1].
In online courses, faculty members need to complete tasks such as
“Facilitating discussions, sharing documents, calendaring/
managing the course, and tracking learning progress” [1, p. 3].
Faculty members also arrange for student access to information,
offer social interactions within the course, provide for student
contributions, and allow for student learning experiences. Student
and faculty users in online courses have to manage many levels of
functioning, so a usable interface is a great boon. Learners need to move from their learning goals and activities through the design of the interface, which should not pose a barrier to achieving those goals [1].
Indeed, “If the interface is not transparent and easy to use, the
learners/students concentrate on interaction aspects and not on
acquiring content” in online courses [2, p. 107]. In addition, the
levels of usability and satisfaction may depend on students’ IT skills
and previous experience with the interface [2, p. 119]. Designers need to think about usability and educational aspects together when shaping online course interfaces [2, p. 118].
Fortunately, several researchers have begun to address these challenges through the use of heuristics and usability testing, as well as other means of determining the usability of an online course’s interface.
Each approach offers advantages as well as limitations; faculty
members need to consider their situational resources and
constraints as they explore multiple means of assessing usability.
Below, I discuss several approaches to assessing online course
usability, followed by a discussion of how faculty members can
apply these approaches within organizational constraints and
resources. Insights from my recent experiences follow, providing
concrete examples of considering both pedagogical concerns and
usability concerns in designing online course interfaces.
3. EMPLOYING HEURISTICS TO
EVALUATE THE USABILITY OF ONLINE
COURSE INTERFACES
Heuristics designed for online learning [1], [3] can inform design decisions relevant to the user experience in online courses. In
designing heuristics, practitioners need to develop “rigorous,
replicable principles for the design of e-learning environments and
instructional materials” [1, p. 2].
Some heuristics have a limited scope, which restricts their usefulness for assessing usability in an online course. For instance, the Quality Matters Rubric, Fifth Edition, lists only a few standards to consider in evaluating the usability of an online course. In a section entitled “Accessibility and Usability,” the topics include only general considerations such as
“Course navigation facilitates ease of use.
Information is provided about the accessibility of all
technologies required in the course.
The course provides alternative means of access to
course materials in formats that meet the needs of
diverse learners.
The course design facilitates readability.
Course multimedia facilitate ease of use” [4].
While “ease of use,” “accessibility,” and “readability” are important components of usability, they do not cover the
multiplicity of features that make an online course usable. While
other sections of the QM rubric address the pedagogical nature of
online courses, a richer heuristic is needed to guide the design of a
usable LMS interface.
Nielsen’s [5] well-known and widely used heuristics for web sites
may not be adequate for providing guidance for designing the
interface of an online course. Nielsen himself calls them “broad
rules of thumb and not specific usability guidelines.” His
heuristics include
“Visibility of system status
Match between system and the real world
User control and freedom
Consistency and standards
Error prevention
Recognition rather than recall
Flexibility and efficiency of use
Aesthetic and minimalist design
Help users recognize, diagnose, and recover from errors
Help and documentation” [5]
Despite the general nature of Nielsen’s heuristics, Shih, et al. [6]
use them as a basis for designing one aspect of an educational
interface. Granića and Ćukušić [2] also utilize Nielsen’s heuristics as one set of criteria (among four) employed by experts in HCI and e-learning to evaluate the usability of an online LMS.
A richer list of usability heuristics, specifically tailored for educational interfaces, comes from Mehlenbacher, et al. [1], offering guidance for designers and reviewers of online courses.
Major categories include
“Learner Background and Knowledge
    Accessibility
    Customizability and maintainability
    Error support and feedback
    Navigability and user movement
    User control, error tolerance, and flexibility
Social Dynamics
    Mutual goals and outcomes
    Communication protocols
Instructional Content
    Completeness
    Examples and case studies
    Readability and quality of writing
    Relationship with real-world tasks
Interaction Display
    Aesthetic appeal
    Consistency and layout
    Typographic cues and structuring
    Visibility of features and self-description
Instructor Activities
    Authority and authenticity
    Intimacy and presence
Environment and Tools
    Help and support documentation
    Metaphors and maps
    Organization and information relevance
    Reliability and functionality” [1, pp. 7-9].
In the ten years since Mehlenbacher, et al. [1] was published, the use of mobile devices has expanded, so an updated version of their rich and complex heuristics might consider the usability of online courses in mobile environments.
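Faculty members who want to apply such heuristics systematically can make even an informal expert review more replicable by recording ratings in a simple scoring sheet. The sketch below is a minimal illustration in Python; the four category names are drawn from the list above, but the reviewers, rating scale, and revision threshold are hypothetical choices, not part of any published instrument.

```python
from statistics import mean

# A small subset of heuristic categories, echoing Mehlenbacher et al.'s groupings.
HEURISTICS = [
    "Navigability and user movement",
    "Help and support documentation",
    "Readability and quality of writing",
    "Consistency and layout",
]

# Each reviewer rates every heuristic from 1 (serious problems) to 5 (no problems).
ratings = {
    "Reviewer A": {"Navigability and user movement": 2,
                   "Help and support documentation": 4,
                   "Readability and quality of writing": 5,
                   "Consistency and layout": 3},
    "Reviewer B": {"Navigability and user movement": 3,
                   "Help and support documentation": 3,
                   "Readability and quality of writing": 4,
                   "Consistency and layout": 3},
}

# Average the ratings per heuristic and flag the areas that most need revision.
for heuristic in HEURISTICS:
    avg = mean(r[heuristic] for r in ratings.values())
    flag = "  <-- revise" if avg < 3.5 else ""
    print(f"{heuristic}: {avg:.1f}{flag}")
```

Even a rough sheet like this lets a course designer compare reviewers, track whether revisions improved a weak category, and keep the review from depending on one person's memory.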
To be useful for evaluating the usability of online courses, heuristics
need to integrate usability principles with sound pedagogical
principles. At this time, there is no widely accepted set of principles,
but foundations have been laid for their development. However,
even the most effective expert heuristic analysis can provide only
part of the insights needed for assessing the usability of online
courses.
One limitation of relying solely on heuristics in designing
educational interfaces is that heuristics do not take into
consideration how learners decide to interact with the system in
order to learn. Another limitation is that instructors may place poorly-considered content into an otherwise usable system [2, p. 117]. Additional means are necessary for creating an optimally
usable online course. Approaches that are more direct, including
usability testing, can provide richer insights, as discussed next.
4. EMPLOYING USABILITY TESTING
AND OTHER DIRECT MEANS TO
EVALUATE THE USABILITY OF ONLINE
COURSE INTERFACES
Conducting usability testing and other direct means can help
instructors evaluate specific, situated interface designs for their
usability [1], [6], bringing in dimensions that heuristic analysis
cannot address. In fact, in one case, “Although many interface
problems were identified by expert reviews, it was the user
testing that enabled us to determine which problems actually
impeded the users’ (students’ and teachers’) ability in successful
task completion” [2, p. 120].
Using multiple methods is a well-accepted approach for determining usability: “We assumed that the usability testing complemented with inspections that rely upon experts judging the interface compliance with recognized usability principles along with considerations of educational perspective would provide a more accurate evaluation” [2, p. 110]. Faculty members
with limited resources may need to select the approach or
approaches that best use those resources in assessing online course
usability.
Methods that Granića and Ćukušić employed in their usability
evaluations of a European LMS included questionnaires for users to
report their previous experience with IT and the LMS under study,
memory tests, attitude questionnaires and semi-structured
interviews after usability testing, and evaluators’ notebooks recording task completion during end-user testing [2, pp. 110-111].
In their study, Granića and Ćukušić conducted task-based usability
testing on the LMS prototype with 47 teen-aged students and 23
teachers at 9 locations [2], thus leading to richer insights. However,
this extensive research may fall beyond the scope of the time and
funding that many faculty members can access.
In Shih et al.’s study [6], participants were asked to interact with
animated hierarchical maps in an LMS and complete a
questionnaire that asked about their opinions and reactions to that
feature of the interface. In addition, they completed a post-test to
assess their memory of the content of the hierarchical maps. These
approaches supplemented Shih’s group’s use of Nielsen’s heuristics
in the design of the hierarchical maps to evaluate the usability of
this feature of the LMS.
Usability testing, however, suffers from the limitation that it
consumes a good amount of time and resources in planning,
administering, and analyzing results. Nevertheless, I argue that it is time well invested. Even minimal testing can provide results that will benefit designers of courses within an LMS.
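Even minimal, single-instructor testing benefits from a consistent record of what participants tried and where they stalled. The following sketch is a hypothetical example, with invented task names, participants, and timings, of logging task success and time on task so that problem areas of the interface stand out; it illustrates small-scale record keeping rather than any particular study's protocol.

```python
from dataclasses import dataclass

@dataclass
class TaskResult:
    participant: str
    task: str              # e.g., "Find the syllabus", "Submit Assignment 1"
    seconds: int           # time on task
    completed: bool        # finished without help from the facilitator?

# Hypothetical observations from a small think-aloud session.
results = [
    TaskResult("P1", "Find the syllabus", 45, True),
    TaskResult("P2", "Find the syllabus", 180, False),
    TaskResult("P1", "Submit Assignment 1", 90, True),
    TaskResult("P2", "Submit Assignment 1", 75, True),
]

# Summarize completion rate and average time per task;
# low completion rates point to interface problems worth fixing first.
for task in sorted({r.task for r in results}):
    attempts = [r for r in results if r.task == task]
    rate = sum(r.completed for r in attempts) / len(attempts)
    avg_time = sum(r.seconds for r in attempts) / len(attempts)
    print(f"{task}: {rate:.0%} completed, avg {avg_time:.0f}s")
```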
Additional direct means such as student feedback or online analytics can help course designers understand features of online courses that lead to enhanced usability and learning. Student feedback can provide insights from actual users of the LMS, employing it in a variety of circumstances and in the contexts of actual learning goals and activities. However, relying solely on this feedback means that faculty members gain insights only after the fact: suggestions for improvement might apply to future versions of the course, but problems are likely to remain in the current course and may inhibit student learning and satisfaction.
Online analytics can also provide useful insights. Many LMSs today have built-in options for viewing usage statistics, which may prove useful to faculty members. However, one limitation is that these statistics provide only limited information about user behavior. For instance, faculty members may be able to see that a student spent 20 minutes on a quiz that should have taken only 10 minutes, but the reason for the extended time is unknown. Perhaps the student did not know the material, perhaps the student encountered technical difficulties, perhaps the student was interrupted, or perhaps the quiz was not well designed; the reasons are not available from studying the usage statistics alone. Hence a variety of direct approaches is advisable for gaining rich insights.
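As a concrete illustration of both the value and the limits of such statistics, the sketch below scans a hypothetical quiz-usage export (the file name and column names are assumptions, not an actual LMS export format) and flags attempts that ran well past the expected time; the flag shows that something happened, but the statistics alone cannot reveal why.

```python
import csv

EXPECTED_MINUTES = 10          # how long the quiz should take
FLAG_THRESHOLD = 1.5           # flag attempts over 150% of the expected time

# Hypothetical export with columns: student, quiz, minutes_spent
with open("quiz_usage_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        minutes = float(row["minutes_spent"])
        if minutes > EXPECTED_MINUTES * FLAG_THRESHOLD:
            # The statistic shows the extended time, but not the reason:
            # unfamiliar material, technical trouble, an interruption,
            # or a poorly designed quiz all look identical here.
            print(f'{row["student"]} spent {minutes:.0f} min on {row["quiz"]} '
                  f'(expected about {EXPECTED_MINUTES} min)')
```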
Despite the limitations of direct means of assessing usability,
faculty members would be wise to combine the heuristic
approaches discussed in the previous section with the direct
means discussed in this section. In doing so, they need to consider
their unique contextual constraints and resources as they attempt
to assess the effectiveness of online course usability and
pedagogical soundness, as discussed in the next section.
5. CONSIDERING CONTEXTUAL
CONSTRAINTS AND RESOURCES WHEN
WORKING TO ENHANCE ONLINE
COURSE USABILITY
Ideally, one would begin usability evaluations and/or testing early in the process of designing online course interfaces and continue throughout [2], [6], but that ideal is seldom achieved, often owing to situational constraints.
Within educational contexts, technical communication faculty
members need to consider the feasibility of completing processes
that may lead to greater usability of online courses. Given that
faculty members often face limited resources of time, technical
background, freedom to alter the interface, etc., they need to
consider what is possible within organizational constraints. At
times, they may take small steps, and at other times, they may be
able to take larger ones to improve usability.
Overall, technical communication faculty members are in a
position to understand usability and to collect data that support
arguments for improved usability of online courses. Specifically,
faculty members need to consider administrative and political
constraints and resources which will vary across institutions.
5.1 Utilizing Resources
Carrying out extensive heuristic analysis and/or more direct
usability measures requires access to appropriate resources. For
instance, Granića and Ćukušić’s far-reaching study was funded by
their national government’s educational agency, and Shih, et al.’s
study was funded by their National Science Council. One source
of support for the time and energy needed to assess usability can
be external grants, but another source can be internal grants. For
instance, I have been involved in two campus-level Curriculum Enhancement Grants, funded by the Center for Teaching and Learning on my campus, that not only allowed me to develop new online courses but also to conduct small-scale efforts toward making them as usable as possible.
In a context of limited resources, faculty members can also create class assignments that ask technical communication students to apply heuristic analysis and/or more direct approaches to assess the usability of the LMS they are using. The assignment
can aid students in learning more about usability and its
assessment, and the results can provide the faculty members with
insights useful in revising future online courses.
Institutions that are interested in expanding online learning would
be wise to invest resources in offering high quality courses that
lead to their long-term success, and technical communication
faculty members can base their arguments on this goal when
advocating for resources that will provide feedback that can lead to improved usability in an LMS interface. Faculty members would
be wise to seek out institutional resources, however limited, to
assist them in making initial steps toward improving usability.
5.2 Addressing Constraints
Even with good access to resources, faculty members may face
multiple constraints when they try to improve the usability of
online courses. However, with careful thought and planning,
many of these constraints can be addressed.
One constraint is that the LMS may be designed by someone else and may not offer good usability or an appropriate pedagogical approach within its interface; technical communication faculty members using such an LMS may not have interfaces that provide optimal usability [8], [9].
Additionally, unique courses or disciplinary circumstances may
require pedagogical approaches that interface designers did not build into the LMS. For instance, the LMS may have been
designed for a course that emphasized “attendance and re-
presentation of lecture notes rather than interaction, peer review,
and authorship” [9, p. 2], the latter being features of many
technical communication courses.
While no single approach to addressing constraints fits every
situation, technical communication faculty members would do
well to persist in finding ways to address these constraints. One
approach that I recommend is to attempt to influence the design of
the LMS early in the process if possible in order to enhance
usability. Another is to provide feedback to the LMS designers in
whatever form is available. For instance, our institution recently
began offering Instructure’s Canvas to replace the older LMS that
we had been using for many years. Fortunately, Canvas has a
means by which users can submit suggestions for change and can
vote on other people’s suggestions for change. Ideally, if a
suggested change has many votes, the designers will attempt to
implement the change.
In situations of limited resources, employing heuristic approaches
often consumes less time and energy than more direct measures.
While heuristics’ results may be less useful than results from more
direct measures, it may be necessary to begin with heuristics and
make modifications to an online course based on the results, all
the while creating arguments for more extensive resources that
could provide measures that are more direct and thus yield richer
insights into usability.
Overall, I encourage technical communication faculty members to
utilize their persuasive skills and knowledge of usability to
encourage improvements in existing LMSs. And while there may be high-level matters that faculty members do not have the authority to alter, they can work within the existing design to make courses as task-oriented and straightforward as possible for the benefit of students.
For instance, on a small scale, Mehlenbacher, et al. [1] recommend these standards for the writing style of instructional material in online courses: “Is the text in active voice and concisely written (> 4 < 15 words/sentence)? Are terms consistently plural, verb+object or noun+verb, etc., avoiding unnecessarily redundant words? Can users understand the content of the information presented easily?” [1, p. 8]. Technical
communication faculty members have the ability to write
materials that meet these standards within the larger design of the
interface. Even a simple change in wording can play an important role in enhancing usability in situations in which faculty members lack the access needed to make larger changes.
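A faculty member could even partially automate a rough check against the sentence-length portion of these standards before posting assignment text. The sketch below uses naive sentence splitting and word counting, so it is only a first-pass aid rather than a substitute for editing; the sample assignment wording is invented.

```python
import re

def flag_long_or_short_sentences(text, low=4, high=15):
    """Flag sentences outside the roughly 4-15 word range suggested in [1]."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    for s in sentences:
        words = len(s.split())
        if words and not (low <= words <= high):
            print(f"{words:>2} words: {s}")

# Invented sample of assignment wording to illustrate the check.
sample = (
    "Read the overview of Module 3. "
    "Draft the instructions, then post them to the discussion board, then review "
    "two classmates' drafts, then revise your own draft based on their comments "
    "and on the criteria listed in the assignment sheet before submitting."
)
flag_long_or_short_sentences(sample)
```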
Realistically, educational contexts provide many constraints that
limit the potential effectiveness and usability in online courses.
However, it is the responsibility of faculty members to work within
those constraints and to advocate for better situations whenever
possible.
6. EXAMPLES FROM RECENT
EXPERIENCES IN DESIGNING USABLE
AND PEDAGOGICALLY SOUND
COURSES
Many of these principles came into play as I worked on designing
new online courses, transforming existing courses into online
versions. I was aware that many of the pedagogical principles for
face-to-face education might apply to an online situation, but that
several would not. Space limitations prevent me from providing
an extensive list, but the examples below may inspire similar
activity from other faculty members within their contexts.
In designing courses to be pedagogically sound and to provide as
usable an interface as possible, I considered several usability and
pedagogical principles and practices, including:
Simplicity: I understood that students in an online course would
need a great deal of deliberate simplicity in the content and the
wording, consistent with Mehlenbacher, et al.’s advice [1]. Within the
Canvas LMS, I made wording as simple and direct as possible. I
aimed not to have a chatty style. In addition, I tried to make each
assignment step begin with a verb so that students would know
what they needed to do to complete an assignment.
In addition, Canvas provides many tools that faculty members
might use in courses, but I included only the most essential tools
in the navigation bar and hid less common tools from students in
order not to overwhelm them with too much information. Canvas
also provides a Modules tool that allows a faculty member to put
assignments, discussions, quizzes, resources, etc. in one place in
sequential order, so I used that tool to help students work through
steps in order. Even though incorporating various assignments and
activities into one module was more work for me, it reduced the
number of places that students would have to visit, thus providing
greater simplicity.
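Where an institution allows API access, some of this simplification can even be scripted rather than clicked through course by course. Canvas exposes a REST API that includes course navigation tabs; the sketch below assumes endpoints of the form /courses/:id/tabs with a "hidden" parameter, which should be verified against Instructure's current API documentation, and the domain, token, course ID, and list of essential tools are placeholders.

```python
import requests

# Placeholder values: replace with your institution's Canvas domain,
# a real API token, and the actual course ID.
BASE = "https://canvas.example.edu/api/v1"
TOKEN = "REPLACE_WITH_API_TOKEN"
COURSE_ID = 12345
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Tools to keep visible in the course navigation; everything else is hidden.
# (Some tabs, such as home and settings, cannot be hidden at all.)
ESSENTIAL = {"home", "settings", "modules", "assignments", "grades", "announcements"}

tabs = requests.get(f"{BASE}/courses/{COURSE_ID}/tabs", headers=HEADERS).json()
for tab in tabs:
    if tab["id"] in ESSENTIAL or tab.get("hidden", False):
        continue
    # Ask Canvas to hide the less essential tool from students.
    resp = requests.put(
        f"{BASE}/courses/{COURSE_ID}/tabs/{tab['id']}",
        headers=HEADERS,
        params={"hidden": "true"},
    )
    print(f"{tab['label']}: {'hidden' if resp.ok else 'could not hide'}")
```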
Providing a sound communication-creation process: Consistent
with a process-based approach to teaching technical
communication, I designed course modules so that a series of
planning steps allowed students to make decisions about a major
project throughout a carefully-thought-out process. Steps also
provided students with opportunities to receive feedback from me
and from classmates so that they could adjust the directions of
final deliverables.
Teaching communication processes in face-to-face technical
communication courses may be common, but in an online course,
multiple steps appearing in the interface of the LMS may
overwhelm students or give them the impression that there is a lot
of “busy work” in the course. As much as possible, I tried to
provide a rationale for the steps and to streamline them to meet
pedagogical goals and usability goals simultaneously.
Help and documentation: Consistent with Nielsen’s principle that
users should have useful help and documentation, I considered
that students new to an online course need orientation to the LMS
interface and to the course principles. (I recently conversed with a
faculty member who thought his students were “stupid” for not figuring out where to find course materials, but he had not
provided any orientation to them.) Even if students have used the

LMS in a previous course, the way I use it might differ from their
previous experiences. To address that issue, at the start of the
course I provide time for students to complete activities that will
help them become familiar with the course objectives and the
LMS interface. Fortunately, Canvas provides several good
tutorials about using Canvas, so I was able to point my students
towards those resources.
Even with an orientation at the start of a course, it may still be
necessary to provide ongoing orientation and even more explicit
wording. For instance, at the start of each module, I provide an item titled “Overview of Module X” that explains the module.
This item is followed by relevant items that guide students’
learning activities. About halfway through a recent course, I
learned that students had not read the overview and did not
understand the assignment as they were completing the steps. One
student reported, “I didn’t know we had to read the overview.”
Hence, I now begin the title of that item with a verb, “Read the overview of Module X,” in order to encourage students to read
each overview before completing the steps.
Using direct and indirect means for assessing usability: Although
I have not yet been able to conduct direct usability testing of the
online courses I’ve designed, I’ve been able to use several other
indirect and direct means to assess the usability of the courses.
I have participated in the Quality Matters training to learn about
general principles for effective online courses and have applied
those principles to several of my courses. For instance, QM rubric
item 7.1 states “The course instructions articulate or link to a clear
description of the technical support offered and how to obtain it,”
so I have included such information in my course syllabus and
other relevant materials for the benefit of students who may need
that technical support in order to complete the course successfully.
I’ve also applied other general principles of effective pedagogy
and effective usability to make the course as usable as possible
within my current circumstances.
The most effective direct means I have been able to employ
comes from student feedback about the design of the course. For
instance, in a course that I offered a year ago, the students
commented that a reading quiz on each textbook chapter in an
online course was too much, given the other work they needed to
complete for the course. I modified the course the next time it was
offered to include fewer quizzes but to include several chapters in
each quiz. This approach seemed to be more manageable for
students.
I anticipate using additional indirect and direct means to assess the
pedagogy and usability of my online courses as appropriate
resources become available within my context.
7. CONCLUSION
In creating a usable online course, faculty members design
interfaces for both novices and experienced users. In this design,
faculty members cannot rely solely on intuition but should employ
heuristics and more direct measures to assess course usability.
In using multiple approaches, technical communication faculty members should consider that
we should not rely on isolated evaluations, and expert
reviews are not yet a substitute for end-user testing.
Actually, those are complementary approaches. Users
are oriented toward tasks accomplishment and
subjective look and feel of the system design, and
hence the results achieved through user testing are
appropriate for identification of general usability
problems. On the other hand, experts go deeply into the
structure trying to identify problems that influence
system functions. Therefore, inspection provides a more
precise detection of usability setbacks and at the
same time offers suggestions for possible solutions
[2, p. 120].
When using multiple approaches within organizational contexts,
technical communication faculty members should keep in mind the
value of both heuristic analysis and direct approaches because “a
usable e-learning system is not just a resource with a nice ‘look &
feel’, but an application that communicates content and
structures the interaction in a way that facilitates the learning
experience” [2, p. 119].
8. REFERENCES
[1] Mehlenbacher, B., Bennett, L., Bird, T., Ivey, M., Lucas, J.,
Morton, J., and Whitman, L. 2005. Usable e-learning: A
conceptual model for evaluation and design. In Proceedings
of HCI International 2005: 11th International Conference on
Human-Computer Interaction, Volume 4: Theories, Models, and Processes in HCI (Las Vegas, NV). Mira Digital, 1-10.
[2] Granića, A., and Ćukušić, M. 2011. Usability testing and
expert inspections complemented by educational
evaluation: A case study of an e-learning platform.
Educational Technology & Society, 14 (2), 107-123.
[3] Ballard, J. K. 2010. Web site usability: A case study of
student perceptions of educational web sites. Dissertation.
University of Minnesota. UMI Number: 3408366
[4] Quality Matters. 2014. Standards from the QM higher
education rubric, fifth edition. www.qualitymatters.org
[5] Nielsen, J. 2005. 10 usability heuristics for user interface
design. http://www.nngroup.com/articles/ten-usability-heuristics/
[6] Shih, Y.-C., Huang, P.-R., & Chen, S.-Y. 2013.
Incorporating usability criteria into the development of
animated hierarchical maps. Educational Technology &
Society, 16 (1), 342-355.
[7] Bernhardt, S.A. 2013. Developing a web-served handbook
for writers. In George Pullman and Baotong Gu (Eds.)
Designing Web-based Applications for 21st Century Writing
Classrooms. Baywood: Amityville, N.Y.
[8] Staggers, J., Zoetewey, M.W., and Pennell, M. 2009.
Learning within limits: New faculty and course management
systems. In George Pullman and Baotong Gu (Eds.) Content
Management: Bridging the Gap between Theory and
Practice. Baywood: Amityville, NY.
[9] Pullman, G. and Gu, B. 2013. Introduction. In George
Pullman and Baotong Gu (Eds.) Designing Web-based
Applications for 21st Century Writing Classrooms.
Baywood: Amityville, NY.