Get it wrong and your presentation will command the same level of interest as a family holiday slide show. Want to try using visual art to pep up your message? Use our best tips and techniques to create presentations that will be remembered for all the right reasons. Novices and nervous presenters often make one of two mistakes: employing visual aids as a crutch, using them merely to echo phrases and key points from their spoken sections; or using them as a script that they spend the entire presentation reading from.
Remember that visuals should complement and enhance your messages, not repeat them. Detailed figures and percentages are incredibly difficult for any audience to absorb, and this is one area where visuals make all the difference in conveying your message. Use a data-handling program like Excel to convert data into pie charts and graphs that you can deploy when you need to back up a claim.
You can also compare before and after figures to demonstrate the success of an activity or the results of an experiment. Information is easier to demonstrate when it's displayed as a contrast. Show the improvements in a healthcare program, display where a business is losing customers, outline the flow of history to students — the possibilities are endless. Film clips can be the perfect way to illustrate metaphors, stories, or to reflect human behaviour as part of your presentation.
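The chart-based before/after contrast described above can be sketched in code as well as in Excel. Here is a minimal example using Python and matplotlib; the metrics and values are entirely hypothetical, chosen only to show the grouped-bar layout.

```python
# Hypothetical before/after data for a healthcare program, plotted as a
# grouped bar chart so the contrast can be absorbed at a glance.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt
import numpy as np

metrics = ["Avg. wait (min)", "Readmissions (%)", "Satisfaction (%)"]
before = [42, 17, 61]  # hypothetical baseline values
after = [28, 9, 78]    # hypothetical post-program values

x = np.arange(len(metrics))
width = 0.35

fig, ax = plt.subplots()
ax.bar(x - width / 2, before, width, label="Before")
ax.bar(x + width / 2, after, width, label="After")
ax.set_xticks(x)
ax.set_xticklabels(metrics)
ax.set_title("Program results: before vs. after")
ax.legend()
fig.savefig("before_after.png")
```

A single contrast chart like this usually lands harder with an audience than the same percentages read aloud.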
Using a short section of film or video can also allow you to take a break, and it forms an excellent way of segueing into a new section of the presentation. Struggling to explain a complex point? Create contrast and allow video to do the hard work of getting it across.

The recruitment notice read: You may be eligible to participate in a 45-minute online study. In this study, you will watch professional presentations over Skype from home on your personal computer. Anyone who responded to the recruitment notice was eligible, provided that they were available during one of the prescheduled testing sessions.
Table 2 presents demographic information for the presenter and audience participants. Presenter participants completed a survey remotely before attending the in-person, group sessions with other participants. In the online pre-survey, presenters first answered basic demographic questions (gender, age, education level, English fluency, and occupation). Next, they answered questions about their prior experience with, opinions about, and understanding of the different presentation formats (oral, Prezi, and PowerPoint). This section was prefaced with the following note:
A note on language: When we use the term "presentation," we mean a formal, planned, and oral presentation of any duration, including a public speech, an academic lecture, a webinar, a class presentation, a wedding toast, a sermon, a product demonstration, a business presentation, and so on.
Examples of things we do NOT mean are: a theatrical performance, an impromptu toast at dinner, and any presentation with no audience. When we say PowerPoint presentations, we mean presentations that were made using Microsoft PowerPoint, not other software such as Apple's Keynote. When we say Prezi presentations, we mean presentations that were made using Prezi presentation software.
Also, when we refer to an "oral presentation," we mean a presentation that is only spoken and does not include any visual aids or the use of presentation software. As part of the expertise-related measures, we also asked the participants to identify the purported advantages and disadvantages of each presentation format, according to its proponents and critics, respectively. For PowerPoint and Prezi, we asked participants to identify whether or not it had particular functionalities (e.g., …). Finally, participants viewed three sets of four short Prezi presentations and rank-ordered them from best to worst.
In each set we manipulated a key dimension of Prezi effectiveness, according to its designers: the use of zooming, the connection of ideas, and the use of visual metaphor. Presenter participants were tested in person at the Harvard Decision Science Lab, and randomly assigned to one of the three groups: Prezi, PowerPoint, or oral presentation.
A total of 50 data collection sessions were held. In each session, there were typically three presenter participants (one for each presentation format); as a result of overbooking or participants who failed to arrive, there were ten sessions with only two presenters and six sessions with four presenters. After providing informed consent, participants completed an online survey in the lab in which they rank-ordered three sets of recorded example PowerPoint and oral presentations.
Identical in form to the example Prezi presentations they judged in the pre-survey, these short presentations were designed to assess their understanding of effective presentation design by manipulating a key aspect specific to each format. In selecting these dimensions (and those for Prezi), we consulted with a variety of experts, including software designers, speaking coaches, and researchers.
Next, presenters were shown material from a multimedia case created for and used by the Harvard Business School. They were given the following instructions: For the next two hours, you are going to pretend to be the chief marketing officer of i-Mart, a large chain of retail stores. As a participant in this study, your primary job today is to prepare and then deliver this presentation.
The presentation will be very short (less than 5 minutes) and made live via Skype to an audience of participants who are playing the part of [Company X] executives. On their own computer workstation, participants studied the multimedia case for 30 minutes and were invited to take notes on blank paper provided for them. Following this study period, participants were given 45 minutes to create a presentation in one of three randomly assigned presentation formats: PowerPoint, Prezi, or oral.
To assist participants in the PowerPoint and Prezi conditions, we provided them with a set of digital artifacts including text, data, and graphics related to the case. Participants were not told that other participants were asked to present in different formats, and the workstations were separated from each other to prevent participants from discovering this manipulation. After this preparation period, participants were taken individually in a counterbalanced order to another room to present to a live audience via Skype. For those presenters who consented, we also recorded their presentations for future research purposes.
After making their presentations, presenters completed a final survey about their presentation (e.g., …). Audience participants completed the entire experiment remotely and online. Their participation was scheduled for the end of the presenter sessions so that the in-lab presenters could present live to a remote audience via Skype.
We recruited between three and six audience participants per session, although participants who failed to arrive or Skype connectivity issues resulted in some sessions with only one or two audience participants: Five sessions had one participant, twelve sessions had two participants, sixteen sessions had three participants, eleven sessions had four participants, four sessions had five participants, and two sessions had six participants.
Individuals who responded to the recruitment notice completed a consent form and three online surveys prior to their scheduled Skype session. The first survey was a slightly modified form of the presenter pre-survey (demographics, background on presentation formats, rank-ordering of example Prezis), in which they also scheduled their Skype session. The second survey introduced the decision scenario: You are a board member for [Company X], an innovative clothing company. You and your fellow board members must decide whether or not to accept i-Mart's offer. In the third survey, they rank-ordered the three sets of recorded example PowerPoint and oral presentations.
At the time of the scheduled session, the audience participants logged into Skype using a generic account provided by the research team, and were instructed to turn on their webcams and put on headphones. Immediately after viewing each presentation, participants evaluated it via an online survey.
They rated each presentation on how organized, engaging, realistic, persuasive, and effective it was, using a five-level scale with response options of not at all, slightly, somewhat, very, and extremely. They were also invited to offer feedback to the presenter on how the presentation could be improved. After the final presentation, participants rank-ordered the presentations on the same dimensions (e.g., …). If presenter participants had more experience with and more positive beliefs about one format than the others—and those assigned to that format induced more positive assessments from the audience members than did those assigned to the other formats—then the results are less compelling than if there was no correlation between these baseline measures and the experimental outcomes.
The same applies to audience participants: Are they merely judging presentations according to their initial biases? Conversely, the results are most compelling if there is a negative association between the baseline measures and the experimental findings. Both audience and presenter participants were least experienced with Prezi and most experienced with oral presentations. At the outset, they rated PowerPoint as the most effective and easiest to use to present material and Prezi as the least effective and most difficult to use to present.
For watching presentations, audience participants rated PowerPoint most effective and oral presentations least effective, but rated Prezi as more enjoyable than other formats. For watching presentations, presenter participants did not find any format more effective than the others.
Table 3 presents full descriptive and inferential statistics for all self-reported measures of prior experience with and preexisting beliefs about Prezi, PowerPoint, and oral presentations. Presenters assigned to different formats did not differ in their experience with or pre-existing beliefs about presentation formats. They also did not differ in how well they identified the purported advantages and disadvantages of each presentation format, how well they identified the software features of PowerPoint and Prezi, or how accurately they could identify effective presentations of each format.
In terms of their prior experience with and pre-existing beliefs about presentation formats, both audience and presenter participants were biased in favor of oral and PowerPoint presentations and against Prezi. After presenters were randomly assigned to these different formats, how did the audience evaluate their presentations? Two complexities in the data had to be addressed. First, sessions with two presentations were missing one presentation format, and sessions with four presentations had two presentations of the same format.
To address this complexity, we only conducted pairwise comparisons of different formats (e.g., …). To be certain that the differing number of presentations per session did not somehow bias the results even after adopting these measures, we also conducted an analysis on the subset of sessions that had exactly three presentations. Second, the number of audience participants per session ranged from one to six. In calculating descriptive statistics, some sessions would be weighted more heavily than others unless ratings were first averaged across participants within the same session, then averaged across sessions.
In calculating inferential statistics, averaging across ratings from different participants within the same session who received presentations in the same format was necessary to ensure that the sampling units were independent of each other, an assumption of all parametric and most nonparametric tests. In other words, for both descriptive and inferential statistics, we treated session instead of participant as the sampling unit. As an empirical matter, this multi-step averaging—within participants across identical presentation formats, then across participants within the same session—had little impact on the condition means: compared to the simplest, raw averaging of all ratings in one step, the maximum absolute difference between these two sets of means was negligible. Because we conducted three tests for each dimension—pairing each format with every other—we controlled for multiple comparisons by dividing our significance threshold by the same factor (i.e., three).
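The two-step averaging and the Bonferroni-adjusted threshold can be sketched as follows; the session IDs and ratings are fabricated for illustration only.

```python
# Sketch of the two-step averaging described above, on made-up data.
# Each record: (session_id, format, rating on the 1-5 scale).
from collections import defaultdict

ratings = [
    (1, "Prezi", 4), (1, "Prezi", 5), (1, "PowerPoint", 3), (1, "oral", 3),
    (2, "Prezi", 4), (2, "PowerPoint", 4), (2, "PowerPoint", 2), (2, "oral", 3),
]

# Step 1: average ratings within each (session, format) cell, so every
# session contributes exactly one number per format it contained.
cells = defaultdict(list)
for session, fmt, rating in ratings:
    cells[(session, fmt)].append(rating)
session_means = {key: sum(v) / len(v) for key, v in cells.items()}

# Step 2: average those cell means across sessions, per format, so the
# session (not the participant) is the sampling unit.
by_format = defaultdict(list)
for (session, fmt), mean in session_means.items():
    by_format[fmt].append(mean)
condition_means = {fmt: sum(v) / len(v) for fmt, v in by_format.items()}

# Bonferroni: three pairwise tests per dimension, so the per-test
# significance threshold is the overall alpha divided by three.
alpha = 0.05 / 3
print(condition_means, alpha)
```

Note how session 2's two PowerPoint ratings collapse to a single cell mean before entering the condition mean, which is exactly what prevents heavily attended sessions from dominating.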
Results revealed that presentation format influenced audience ratings. In particular, the audience rated Prezi presentations as significantly more organized, engaging, persuasive, and effective than both PowerPoint and oral presentations; on a five-level scale, the average participant rated Prezi presentations over half a level higher than other presentations. The audience did not rate PowerPoint presentations differently than oral presentations on any dimension.
Table 4 and Fig 1 present these results. The figure shows session-level means from all available data, including those from sessions with two or four presentations. By limiting the analysis to the 34 sessions with exactly three presentations (one of each format), we could ensure that the sessions with two or four presentations did not somehow bias the results. Moreover, this procedure enabled us to conduct omnibus tests of presentation format for each rating dimension. Note: All p-values for pairwise tests here and elsewhere are two-tailed. To explore whether the obtained results were somehow the result of demand characteristics, we analyzed ratings from only the first presentation in each session.
This analysis yielded the same pattern of findings, with a to-be-expected reduction in statistical significance due to the loss of power. As just noted, participants randomly assigned to present using Prezi were rated as giving more organized, engaging, persuasive, and effective presentations compared to those randomly assigned to the PowerPoint or oral presentation conditions. In addition, at the end of each session audience participants rank-ordered each type of presentation on the same dimensions used for the ratings.
The same complexities with the ratings data—the variable number of conditions and audience participants per session—applied as well to the ranking data. We therefore adopted a similar analytic strategy, with one exception: we conducted non-parametric rather than parametric pairwise tests, given the rank-ordered nature of the raw data and distributional assumptions that underlie parametric tests.
Using the session-level mean ranks, we tested the effect of presentation format with three sets of Wilcoxon signed-rank tests. Table 5 and Fig 2 present these results. Audience members ranked the presentations from best to worst, with lower ranks indicating better presentations. As with the ratings data, we also conducted omnibus tests of only those sessions with exactly three presentations to validate that unbalanced sessions did not somehow bias the results.
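The pairwise Wilcoxon signed-rank tests on session-level mean ranks can be sketched like this; the mean ranks below are fabricated for illustration, with each session contributing one mean rank per format (lower rank = better).

```python
# Sketch of one pairwise, non-parametric comparison on session-level
# mean ranks (fabricated data). Formats are compared within session,
# so the test is a paired (signed-rank) test.
from scipy.stats import wilcoxon

prezi_ranks      = [1.2, 1.5, 1.0, 1.8, 1.4, 1.1, 1.6, 1.3]
powerpoint_ranks = [2.1, 2.0, 2.4, 1.9, 2.2, 2.6, 1.8, 2.3]

stat, p = wilcoxon(prezi_ranks, powerpoint_ranks)
print(stat, p)  # in this fabricated data Prezi is ranked better in every session
```

Because every paired difference here favors Prezi, the signed-rank statistic is zero and the test comes out significant even with only eight sessions; real data would of course be noisier.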
Before and after the experimental session, audience participants judged the general effectiveness of the three presentation formats. In the pre-survey, they rated each format on its effectiveness for them as presenters and audience members. As already described (see Table 3), the audience began the experiment judging PowerPoint presentations as most effective for presenters and audiences. Fig 3 presents these results. Note: Means shown for pre-survey items are calculated based on responses from all participants, as opposed to only those who had experience with all presentation formats.
In the pre-survey, some audience participants reported prior experience viewing Prezi presentations but others did not (i.e., …). Prior experience with Prezi was associated with negative pre-existing judgments of PowerPoint. If, for example, the more experience the audience had with Prezi, the worse they evaluated those presentations, such a correlation would suggest that the current findings reflect a novelty effect. We cannot assume that participants understood the reasons behind their rank-orderings (cf. …).
An equal percentage explained their choice in terms of negative judgments of Prezi, including comments that Prezi was disorienting, busy, crowded, amateurish, or overwhelming. Presenter variables—including demographic characteristics and experience with their assigned format—generally did not predict their presentation success, either in terms of audience ratings or rankings. The one exception was that Prezi presenters who were better able to identify effective Prezi presentations were rated and ranked as giving more effective and engaging presentations.
Participants who were randomly assigned to present using Prezi were judged as giving more effective, organized, engaging, and persuasive presentations than those who were randomly assigned to present orally or with PowerPoint. This was true despite the fact that both audience and presenter participants were initially predisposed against Prezi.
What might explain these findings? One explanation is a novelty effect: Perhaps the audience preferred Prezi simply because it is relatively new to them. Another explanation for these results is that the presenters or audience members were somehow biased towards the Prezi presentations. Again, however, this appears not to be the case. The presenters were least experienced in Prezi, judged themselves least effective presenting with Prezi, and found Prezi presentations hardest to create.
All presenters were randomly assigned to their presentation format and were blind to the experimental manipulation. In recruiting audience participants, we did not mention Prezi or PowerPoint, and selected participants only based on their access to Skype and a sufficiently large computer screen. In addition, we minimized contact between the investigator and research participants, and presentations were never identified based on their format; at the end of the experiment, in fact, some participants did not even realize that they had seen a Prezi presentation as evidenced by their free responses.
Data were collected through standardized, online surveys; the investigator was not in the room with the presenter during his or her presentation; and the investigator interacted with the audience only briefly to set up their Skype session. Finally, an analysis of ratings from only the first presentations yielded the same results as the full analysis, making implausible an interpretation based on audience demand characteristics. Thus, the most likely explanation is that individuals do, in fact, perceive Prezi presentations more favorably than PowerPoint or oral presentations.
Experiment 1 has several limitations, however. First, its outcome measures were limited to audience perceptions; in other words, Experiment 1 demonstrated that Prezi presentations are more effective than other formats in terms of audience perceptions, but not in terms of decision-making outcomes. Second, we asked the audience about their pre-existing beliefs and prior experiences with PowerPoint, Prezi, and oral presentations at the beginning of Experiment 1; although it is difficult to imagine how this questioning could have produced the obtained results—particularly given the nature of their pre-existing beliefs and prior experiences—it is a remote possibility.
Third, just like the results from any single experiment, the findings of Experiment 1 should be treated cautiously until replicated. We designed a second experiment to address these limitations and extend the findings from the first experiment. In Experiment 2 we showed online participants a single presentation from Experiment 1, and varied randomly which type of presentation Prezi, PowerPoint, or oral they viewed.
We also randomly assigned some participants to view a presentation on material that was not related to the case material; this control condition served as a baseline that allowed us to estimate the impact of each presentation format. To minimize demand characteristics, we asked participants about their experiences with different presentation formats at the conclusion of the experiment instead of the beginning , and did not expose participants to multiple presentation formats.
Excluding pilot participants who offered us initial feedback on the survey and protocol, individuals consented to and began the experiment. The number of excluded participants did not covary with group assignment or demographic variables. Table 6 presents demographic information on the included participants.
The main stimuli for this experiment consisted of recorded presentations from Experiment 1. For Prezi and PowerPoint presentations, these were split-screen videos showing the presenter on one side of the screen and the visuals on the other side. For the oral presentations, these were simply audiovisual recordings of the presenter.
Of the presenter participants from Experiment 1, 33 either did not consent to being video-recorded or were not recorded due to technical difficulties. Some of the recorded presentations from Experiment 1 were also unusable because of intractable quality issues (e.g., …). We therefore had a pool of presentation videos to use for Experiment 2: 41 from the Prezi condition (out of a possible 50), 40 from the PowerPoint condition (out of a possible 49), and 32 from the oral presentation condition (out of a possible …).
We randomly selected 25 videos in each format, resulting in a total pool of 75 videos. Because of a URL typo that was not detected until after testing, one PowerPoint video was not presented and participants assigned that video were not able to complete the experiment. We were concerned that we could have, perhaps unconsciously, selected better stimuli in the Prezi condition, which would have biased the results.
To ensure that our judgments of major audiovisual problems and the subsequent exclusion of some videos were not biased, we recruited a separate group of participants to rate the audiovisual quality of the presentation videos. Their recruitment notice read: In this study you will judge the technical quality of three short videos. To participate you must have a high-speed Internet connection. These participants were completely blind to the experimental hypotheses and manipulation.
They completed the audiovisual rating task completely online via the Qualtrics survey platform, and were given the following instructions: We need your help in determining the audiovisual quality of some Skype presentations we recorded. We want to know which presentations we can use for additional research, and which need to be eliminated due to major technical problems with the recordings.
You will watch a single presentation video. Please ignore any aspect of the recording other than its audiovisual quality. The only thing we care about is whether the audio and video were recorded properly. Finally, please keep in mind that because these videos were recorded through Skype, even the best recordings are not very high quality.
To address any possibility of experimenter bias—which seemed unlikely, given that we designed the procedure from the outset to guard against such effects—we conducted a series of Presentation Format (Prezi, PowerPoint, oral) x Quality Judgment (inclusion, exclusion) ANOVAs to test (1) whether audiovisual quality was for any reason confounded with presentation format, (2) whether our quality judgments corresponded to those of the blind judges, and (3) whether our exclusion of videos was related to presentation format. We conducted the ANOVAs on the three measures of audiovisual quality collected from the independent judges: ratings of audio quality, ratings of video quality, and judgments of major audiovisual problems.
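As a simplified sketch of the first of these checks, the code below tests whether recording quality differs by presentation format, using made-up quality ratings. For brevity it shows only a one-way test of the format factor, not the full Format x Judgment two-way ANOVA the text describes.

```python
# Made-up audiovisual quality ratings (1-5 scale) per format.
# A non-significant result would indicate no detectable confound
# between recording quality and presentation format.
from scipy.stats import f_oneway

prezi_quality      = [3.8, 4.0, 3.5, 3.9, 3.7]
powerpoint_quality = [3.9, 3.6, 3.8, 4.1, 3.5]
oral_quality       = [3.7, 3.9, 3.6, 4.0, 3.8]

f_stat, p = f_oneway(prezi_quality, powerpoint_quality, oral_quality)
print(f_stat, p)
```

With the near-identical group means above, the test is (by construction) far from significant, which is the pattern one would want before pooling videos across formats.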
In other words, presentation format was not confounded with audiovisual quality, our judgments of quality corresponded to those of blind judges, and our exclusion of videos was unrelated to presentation format. Participants completed the experiment entirely online through Qualtrics. After providing informed consent and answering preliminary demographic and background questions (e.g., …), participants read the following: In this part of the study, you are going to play the role of a corporate executive for [Company X], an innovative clothing company.
You must decide whether or not to accept i-Mart's offer. To help you make your decision, we will first provide you with some background on [Company X] and the i-Mart offer. Please review this background material carefully. This material was an abridged version of what Experiment 1 presenter participants studied, but an expanded version of what Experiment 1 audience participants studied. Participants randomly assigned to the Prezi, PowerPoint, and Oral Presentation conditions were then told the following: Now that you know a little bit about the company, you will watch a video presentation from another research participant.
In this presentation, he or she will try to convince you and your fellow [Company X] executives to accept i-Mart's offer. Because this presentation is from another research participant playing the role of an i-Mart executive (and not an actual i-Mart executive), please disregard the presenter's appearance (clothing, age, etc.). And because we did not professionally video-record the presentation, please also try to disregard the relatively poor quality of the video compared to the videos you just viewed. The purpose of this research is to understand what makes presentations effective. So please listen carefully and do your best to imagine that this is "real".
Participants in the Prezi and PowerPoint groups were asked three additional questions. First, they were asked to rate the visual component of the presentation (i.e., …). Finally, they were asked to comment on the visual component of the presentations, including ways in which it could be improved. All participants then summarized the presentation in their own words, with a minimum acceptable length of 50 characters.
In addition, we asked participants a series of recall and comprehension questions about the case. Finally, and after answering all questions about the business case and presentation, participants answered background questions about their experience with, knowledge of, and general preference for different presentation formats.
They also rank-ordered the mini examples of Prezi, PowerPoint, and oral presentations in terms of their effectiveness. These background questions and tasks were the same as those used in Experiment 1. Participants in the control condition completed the same protocol, with a few exceptions: First, instead of being shown presentations from Experiment 1, they viewed one of three instructional videos matched for length with the Experiment 1 presentations.
And finally, they did not complete the final set of background questions on the different presentation formats or rank-order the example presentations. At the outset, participants rated oral and PowerPoint presentations as equally effective in general, and Prezi presentations as less effective than the other two formats. Just as we found in Experiment 1, participants rated themselves as more experienced and effective in making oral and PowerPoint presentations compared to Prezi presentations.
They also rated oral and PowerPoint presentations as more enjoyable and effective for them than viewing Prezi presentations. When asked how difficult it was to make the different types of presentations, they rated Prezi as more difficult than oral and PowerPoint presentations, and oral presentations as more difficult than PowerPoint ones.
In terms of the number of presentations watched in the last year and in their lifetime—as well as the number of years of experience—they reported more experience watching oral compared to PowerPoint presentations, and more experience watching PowerPoint than watching Prezi presentations. The same pattern was true for their reported experience in making presentations, with one exception: they reported making more PowerPoint than oral presentations in their lifetime. Table 7 presents full descriptive and inferential statistics for all self-reported measures of prior experience with and preexisting beliefs about Prezi, PowerPoint, and oral presentations.
The experimental groups did not differ significantly on any of these variables.
For overall judgments of the presentations, participants rated Prezi as more organized, effective, engaging, and persuasive than PowerPoint and oral presentations, and rated PowerPoint no differently than oral presentations. They also rated Prezi presenters as more organized, knowledgeable, effective, and professional than PowerPoint presenters and oral presenters; Prezi presenters were not rated differently from other presenters on how nervous, boring, enthusiastic, confident, persuasive, or engaging they were, and PowerPoint presenters were rated no differently than oral presenters on all dimensions.
In judging the visual components of the Prezi and PowerPoint presentations, the audience rated Prezi presentations as more dynamic, visually compelling, and distinctive than PowerPoint slides, and marginally more effective and persuasive. Examining the magnitude of mean differences, some effects are clearly larger than others. Most notably, Prezi presentations are rated as most organized and visually dynamic, and Prezi presenters are rated as most organized.
Fig 4 and Table 8 present the descriptive and inferential statistics, respectively, for these audience ratings. Note: rating dimensions are ordered by the magnitude of the difference between Prezi and the other presentation formats; for dimensions with no significant differences between presentation formats, only the overall mean is displayed. There were no significant group differences on any of these variables.
In order to investigate the impact of presentation software on decision-making, we contrasted the Prezi and PowerPoint groups with the oral presentation group, expecting the software-aided presentations to be more persuasive; this is indeed what we found. Excluding participants in the control group (who did not make judgments about comparable presentations), those who rejected the i-Mart offer rated the presentations as worse than those who accepted the i-Mart offer.
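A contrast of this kind, between acceptance decisions in the software-aided groups and the oral group, can be sketched as a chi-square test on a 2x2 table. The counts below are hypothetical, invented only to show the shape of the analysis.

```python
# Hypothetical accept/reject counts: software-aided presentations
# (Prezi + PowerPoint pooled) vs. oral presentations, compared with
# a chi-square test of independence on the 2x2 contingency table.
from scipy.stats import chi2_contingency

#                 accepted  rejected
software_aided = [70, 30]   # hypothetical counts
oral           = [50, 50]   # hypothetical counts

chi2, p, dof, expected = chi2_contingency([software_aided, oral])
print(chi2, p)
```

With these invented counts the 20-point gap in acceptance rates is comfortably significant; the point is the structure of the contrast, not the specific numbers.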
These analyses produced qualitatively identical results, both in terms of decision-making as a function of group assignment and the correlation between decision-making and presentation ratings. Presentation length and recording quality (as assessed by the independent judges) did not correlate with presentation outcomes. Most notably, the better participants did on the PowerPoint rank-ordering task, the worse they rated PowerPoint (but not Prezi) presentations on visual dimensions; the same was true for the Prezi task and presentations. Thus, individuals with more expertise in PowerPoint and Prezi were more critical of PowerPoint and Prezi presentations, respectively.
More participants indicated that there was not enough text, graphs, and animation in PowerPoint presentations than in Prezi presentations, with animation as the most distinguishing attribute. Table 9 presents the descriptive and inferential statistics for these variables. This effect was particularly pronounced for judgments of graphs and text.
Participants who reported too much text also tended to reject the offer. Presenters and presentations were rated worse if they had too much or not enough text, and not enough graphs, images, and animations; in terms of audience decision-making, presentations were less effective if they contained too much or not enough text, or not enough graphs, animations, and images.
PowerPoint presentations were judged to have too little of all attributes, particularly animation. Replicating results from Experiment 1, participants rated presentations made with Prezi as more organized, engaging, persuasive, and effective than both PowerPoint and oral presentations. Extending the Experiment 1 results, participants also judged Prezi presentations as better in various ways e. In making decisions as corporate executives, participants were persuaded by the presentations.
Compared to the baseline decisions of the control group, those in the treatment group shifted their decisions. Some between-format comparisons reached only marginal significance or none at all, but we hesitate to dismiss these differences as statistical noise given their general alignment with the rating results, as well as the correlation between business decisions and presentation ratings, which do vary significantly with format. For the more objective outcome of decision-making, we can at the very least provisionally conclude that Prezi presentations are more effective than oral presentations, and that software-aided presentations are more effective than oral presentations.
Given the goals of the presentations and the design of the experiment, however, we hesitate to draw any conclusions from these null results. The most important finding across the two experiments is easy to summarize: participants evaluated Prezi presentations as more organized, engaging, persuasive, and effective than both PowerPoint and oral presentations. This held for both live and prerecorded presentations, whether participants rated or ranked the presentations, and whether they judged multiple presentations of different formats or only one presentation in isolation.
We have no evidence, however, that Prezi, PowerPoint, or oral presentations facilitate learning in either presenters or their audience. Several uninteresting explanations exist for the observed Prezi effects, none of which posits any specific efficacy of Prezi or of ZUIs in general: namely, novelty, bias, and experimenter effects. We consider each in turn. Novelty heavily influences both attention and memory [87, 88], and the benefits of new media have sometimes dissipated over time, just as one would expect with novelty effects [3]. However, we found no evidence that novelty explains the observed benefits of Prezi: participants who were less familiar with Prezi did not evaluate Prezi presentations more favorably, and only a small fraction of participants who favored Prezi explained their preference in terms of novelty.
We are therefore skeptical that mere novelty can explain the observed effects. We also considered the possibility that participants had a pre-existing bias for Prezi. In fact, both sets of participants entered the research with biases against Prezi, not for it: they reported more experience with PowerPoint and oral presentations than with Prezi, and perceived PowerPoint and oral presentations as more, not less, efficacious than Prezi.
Thus, we reject the idea that the results simply reflect pre-existing media biases. For many reasons, we also find it unlikely that experimenter effects, including demand characteristics, can account for the results. First, at the outset we did not have strong hypotheses about the benefits of one format over the others. Second, the results are subtle in ways that neither we nor a demand-characteristics hypothesis would predict: the effects on subjective experience diverged somewhat from the effects on decision-making, and there were no memory or comprehension effects.
Fourth, we ensured that the presentations were equally high-quality; we did not unconsciously select Prezi presentations that happened to be higher quality than presentations in the other formats. And finally, in Experiment 2 we only explicitly mentioned or asked participants questions about Prezi, PowerPoint, and oral presentations at the conclusion of the experiment, after collecting all key outcome data.
We therefore conclude that the observed effects are not confounds or biases, but instead reflect a genuine and specific benefit of Prezi over PowerPoint or, more generally, of ZUIs over slideware. If these effects reflect intrinsic properties of ZUIs and slideware, then they reveal more interesting and general insights about effective communication. Notably, presenters were much more experienced in using PowerPoint than Prezi and rated PowerPoint as easier to use than Prezi. Finally, audience participants did not simply favor the Prezi presentations in an even, omnibus sense; they evaluated Prezi as better in particular ways that align with the purported advantages of ZUIs over slideware.
This pattern of findings makes the most sense if the mechanism operates at the level of the medium, not the particular software. Taken together, this evidence suggests that Prezi presentations were not just better overall, but were better at engaging their audience visually through the use of animation. Because ZUIs are defined by their panning and zooming animations, and animation is an ancillary and frequently misused feature of slideware, the most parsimonious explanation for the present results is in terms of ZUIs and slideware in general, not Prezi and PowerPoint in particular.
The medium is not the message, but it may be the mechanism. The animated nature of ZUIs makes more sense as a possible mechanism for the observed effects when one considers the relevant literature on animation. Past research has shown that animation can induce physiological and subjective arousal.
This might visually inspire the cards in your storyboard. You are now ready to move on to the final step: building your presentation with Visme. Okay, you can create it with whichever presentation software you like, but we think you'll conclude Visme is one of the best choices out there. First, open a new presentation canvas, then choose a template or start from scratch. When you start from a blank canvas in Visme, you can add pre-built slides one by one from the slide library. Create your slides by following the storyboard.
For an added bonus, you can use animations, videos and audio to make your presentation unique. You can record your own audio and voiceovers within Visme. If your presentation is meant to be seen on its own, online or sent as a scrollable PDF, it may need more text than a visual presentation that accompanies a speech. You can try animating the text so it's not just a big block of words. Using audio also helps, but if viewers have their computer on mute, they might miss it, so make sure your first slide instructs them to turn up the volume.
If your visual presentation is going to be used as a backdrop for a speech, you can forgo some of the text and make it more visual. Remember to rehearse your speech along with the slides so it all flows seamlessly. TED speakers suggest you rehearse a spoken presentation at least 10 times until it flows naturally. Orana is an artist of many trades, currently working as a graphic designer for bloggers and small businesses. Her love of art and travel create the perfect artist-nomad combination. She founded Orana Creative to help freelancers, solopreneurs and bloggers master a better visual strategy.
She is passionate about eye happiness and loves constructive criticism.
Written by: Orana Velarde.