J Am Acad Audiol 2020; 31(09): 636-645
DOI: 10.1055/s-0040-1717123
Research Article

A Content Analysis of YouTube Videos Related to Hearing Aids

Vinaya Manchaiah
1   Department of Speech and Hearing Sciences, Lamar University, Beaumont, Texas
2   Department of Speech and Hearing, School of Allied Health Sciences, Manipal, Karnataka, India
,
Monica L. Bellon-Harn
1   Department of Speech and Hearing Sciences, Lamar University, Beaumont, Texas
,
Marcella Michaels
1   Department of Speech and Hearing Sciences, Lamar University, Beaumont, Texas
,
Vinay Swarnalatha Nagaraj
3   Audiology Program, Department of Neuromedicine and Movement Science, Faculty of Medicine and Health Sciences, Norwegian University of Science and Technology, Trondheim, Norway
,
Eldré W. Beukes
1   Department of Speech and Hearing Sciences, Lamar University, Beaumont, Texas
4   Department of Vision and Hearing Sciences, Anglia Ruskin University, Cambridge, United Kingdom

Abstract

Background Increasingly, people access Internet-based health information about various chronic conditions including hearing loss and hearing aids. YouTube is one media source that has gained much popularity in recent years.

Purpose The current study examines the source, content, understandability, and actionability of YouTube videos related to hearing aids.

Research Design A cross-sectional design in which videos were analyzed at a single point in time.

Study Sample The 100 most frequently viewed hearing aid videos on YouTube.

Intervention Not applicable.

Data Collection and Analysis The 100 most-viewed English language videos targeting individuals seeking information regarding hearing aids were identified and manually coded. Data collection included general information about the video (e.g., source, title, authorship, date of upload, duration of video), popularity-driven measures (e.g., number of views, likes, dislikes), and the video source (consumer, professional, or media). The video content was analyzed to determine what pertinent information each video contained in relation to a predetermined fact sheet. Understandability and actionability of the videos were examined using the Patient Education Materials Assessment Tool for Audiovisual Materials.

Results Of the 100 most-viewed videos, 11 were consumer-based, 80 were created by professionals, and the remaining 9 were media-based. General information about hearing aids, hearing aid types, and handling and maintenance of hearing aids were the most frequently discussed content categories, with over 50% of all videos commenting on these areas. Differences were noted between source types in several content categories. The overall understandability score for videos from all sources was 74%, which is considered adequate; however, the overall actionability score was 68%, which is considered inadequate.

Conclusion YouTube videos about hearing aids focused on a range of issues and some differences were found between source types. The poor actionability of these videos may result in incongruous consumer actions. Content and quality of the information in hearing aid YouTube videos needs to be improved with input from professionals.



Digital advancement has increased the number of people using the Internet, especially as a resource for health information.[1] [2] Flanagin and Metzger[3] concluded that people gathered health-related information from the Internet more frequently than from other sources such as television, radio, or magazines. The Internet provides an engaging means for patients to access easily available information across various health-related conditions.[4]

Worldwide, older adults are the fastest growing community of Internet users. In the U.S., smartphone and Internet use among older adults doubled between 2013 and 2017.[5] Akkermans[6] found that Internet usage among older adults in the Netherlands was increasing notably, with nearly 60% of older adults (i.e., 65–75 years) using the Internet in 2011. Internet usage among older adults in Europe was previously found to be 81%, and of those, 54% used the Internet to seek health-related information.[7] [8]

Internet health information access by older adults seems to have increased for hearing and other chronic diseases (e.g., diabetes, arthritis). Increases are reported on both Internet websites and social media. Henshaw et al[9] suggested that older adults (50–74 years) in the United Kingdom experiencing slight hearing difficulty have increased odds of greater computer skill and Internet use than those reporting no difficulty. Thorén et al[10] indicated that over 60% of adults with hearing loss living in Sweden used computers and the Internet. When investigating the use of social media among those using hearing aids, Choudhury et al[11] found it was used for relationship building, support, and information sharing.

The proliferation of the Internet has provided an opportunity for patients to search for and gather health-related information that was previously unavailable to them. Despite these advantages, seeking health-related information from the Internet creates challenges. Specifically, there are concerns regarding the accessibility of credible and accurate Internet-based health information.[3] [12] Many users tend to select both relevant and irrelevant pages on the Internet.[13] People increasingly rely on Internet information without considering the authenticity and the accuracy of the information. Consequently, there is potential for people to be misled with the information available on the Internet.[14]

YouTube is a popular social media platform and is ranked the second most popular website globally.[15] YouTube provides an outlet for health-related information developed by professionals, health organizations, and/or patients.[16] A systematic review published in 2015 found that YouTube is increasingly used as a platform for disseminating health information.[17] However, the quality of information was variable, with information from governmental and professional organizations being of higher quality and more authentic.[17] The authors suggested that more information and intervention should be available to support consumers in critically evaluating the information posted on YouTube, thereby allowing consumers to use the information to make effective health care decisions. As such, professionals need to examine the content, understandability, and actionability of video-based Internet information.

Understandability is conceptually defined as the ability of people from diverse backgrounds with varying health literacy abilities to comprehend educational materials and extract key messages.[18] Actionability refers to the ability of learners to identify what actions can be taken on the basis of educational material information.[18] Within hearing health care, only one study has examined Internet-based video information. Basch et al[19] examined information about tinnitus contained in the most widely viewed videos on YouTube, as well as the upload source of these videos. Of the 100 most frequently viewed videos, the largest share (42%) was uploaded by consumers and mainly consisted of personal experiences; however, the authors did not include measures of understandability and actionability of video information.

There are many tools available to evaluate text-based online and offline materials (e.g., readability, understandability, quality of treatment information).[20] These tools have been used to evaluate print materials and have been extended to evaluate online text materials of hearing-related information.[21] [22] However, these tools do not include analysis of audio-visual information. One validated method to examine understandability and actionability of audio-visual information is the Patient Education Material Assessment Tool for Audiovisual Materials (PEMAT-A/V).[23] [24] PEMAT-A/V has been recently used to evaluate information directed at a patient audience.[25]

Gabarron et al[26] examined the most frequently used methods to evaluate online videos. Expert ratings were the primary method, but due to the volume of online videos, expert ratings may not always be feasible. The second most frequently used evaluation was popularity (e.g., public ratings such as view count). Unfortunately, popularity of videos may be manipulated or misleading. For example, a view is counted following 30 seconds of watching a YouTube video (Marketing Land, 2015). An individual may not watch the full video, so interpretation of popularity may be speculative. The third most frequently used measure was meta-data (e.g., video length, number of views). Examining video length relative to other meta-data (e.g., thumbs up, thumbs down) may provide information regarding how populations interact with the videos during searches or viewing.[27]

Summary and Study Purpose

Due to the presentation of both visual and auditory information in video formats, YouTube is often a popular choice for seeking health care information. Determining the quality of health care videos is required to inform service providers and clinicians. Videos can be evaluated through multiple methods and tools (e.g., meta-data, source, PEMAT-A/V), and evaluating online material across multiple dimensions increases the strength of the evaluation.[20] As noted, only one study has evaluated hearing health care videos; specifically, the content and source of tinnitus videos were examined.[19] The present study extends this work by examining Internet-based video information related to hearing health care. The purpose of this study was to examine the source, content, understandability, and actionability of YouTube videos related to hearing aids.



Method

Study Design

A cross-sectional study design was used. The study design and method were informed by recent YouTube studies in other health areas, such as tinnitus, autism spectrum disorder, prostate cancer, and skin cancer.[19] [25] [28] [29] The study did not require ethical approval because it did not involve human subjects.



Data Extraction

Data extraction aimed to identify the 100 most widely viewed English language videos related to hearing aids on YouTube. The decision to include only 100 videos was in line with the exploratory nature of the study and is consistent with previously published studies.[19] [25] [28] The main rationale is that the most-viewed videos are likely to appear first when people search these keywords and to be suggested when YouTube users watch similar videos; in other words, the most-viewed videos are most likely to be accessed by users searching for information in this area. It is noteworthy that popularity carries a time advantage: a video published in 2012, for instance, has had more time to accumulate views and likes than one published in 2018. However, we believe that the content of a video is likely to drive its popularity more than other factors such as time, so a more recently published video with content that appeals to more users may still attract more views and likes than one published in 2012. Hence, popularity-based inclusion criteria were deemed appropriate. In addition, the inclusion criteria required that the video be available in English and present explicit information related to hearing aids. Videos were excluded if their focus was not on hearing aids (e.g., assistive listening devices, implantable devices, assessment procedures) or if they included nonexplicit information (e.g., a song).

Broad inclusion criteria and a general search term were used to simulate a search that a member of the general public might perform. The search term used was “hearing aid.” The number of views for each video was recorded to identify the 100 most widely viewed videos. Search results on YouTube vary depending on (1) the type of Internet browser, (2) the time of the search, and (3) whether the researcher is logged in to a personal YouTube (or Gmail) account. To minimize user-targeted search results, the browser history was deleted, cookies were cleared, and the search was performed in private mode in the Mozilla Firefox browser (Version 62.0.3).
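The search itself was performed manually in a logged-out browser, as described above. For readers who want to assemble a comparable popularity-ranked sample programmatically, the sketch below shows one way to do so with the YouTube Data API v3 in Python. It is illustrative only: the API key, the number of result pages, and the exported field names are assumptions and were not part of the original study, and API results will not exactly match an interactive browser search.

```python
# Illustrative sketch only: the study used a manual, logged-out browser search.
# This shows a comparable popularity-ranked extraction via the YouTube Data API v3.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # hypothetical credential
youtube = build("youtube", "v3", developerKey=API_KEY)

def most_viewed(query="hearing aid", pages=3):
    """Return basic meta-data for videos matching the query, ordered by view count."""
    videos, token = [], None
    for _ in range(pages):
        search = youtube.search().list(
            q=query, part="id", type="video",
            order="viewCount", relevanceLanguage="en",
            maxResults=50, pageToken=token,
        ).execute()
        ids = [item["id"]["videoId"] for item in search["items"]]
        details = youtube.videos().list(
            part="snippet,statistics,contentDetails", id=",".join(ids)
        ).execute()
        for v in details["items"]:
            stats = v["statistics"]
            videos.append({
                "title": v["snippet"]["title"],
                "uploaded": v["snippet"]["publishedAt"],
                "duration": v["contentDetails"]["duration"],  # ISO 8601, e.g., PT4M39S
                "views": int(stats.get("viewCount", 0)),
                "likes": int(stats.get("likeCount", 0)),
            })
        token = search.get("nextPageToken")
        if token is None:
            break
    return sorted(videos, key=lambda v: v["views"], reverse=True)

if __name__ == "__main__":
    for video in most_viewed()[:100]:
        print(video["views"], video["title"])
```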

After searching and applying the inclusion criteria, a total of 145 videos were extracted. Of these, 45 were excluded because they did not meet the inclusion criteria or did not contain relevant content (see [Table 1]). Exclusions comprised: (1) non-English videos (n = 1); (2) videos no longer available on YouTube (n = 2); and (3) information not related to hearing aids (n = 42). Videos not related to hearing aids covered: (1) assistive listening devices (n = 4); (2) implantable devices (n = 7); (3) reactions of a baby or child wearing a hearing aid (n = 11); (4) hearing loss more generally, for example, ear impressions, ear wax, and audiological evaluation (n = 7); and (5) plays, songs, and TV shows about hearing aids (n = 11).

Table 1

The content category and description that was used in coding the YouTube video content

Content category: Description

Hearing mechanism: Descriptions regarding the auditory system and the sensation of hearing in both normal and abnormal auditory systems

Information about hearing loss: Explanations about hearing loss including types or degree of hearing loss, causes of hearing loss, and consequences of hearing loss

Hearing aid type: Highlighting the different types of hearing aids including Body Worn, Behind-the-Ear (BTE), In-the-Ear (ITE), In-the-Canal (ITC), Receiver-in-the Canal (RIC), Open fit, etc.

Hearing aid features and functionalities: Outlining different hearing aid features and functionalities including analog versus digital, microphone technology, signal processing strategies, feedback cancellation, telecoil, etc.

Handling and maintenance of hearing aid: Accounts of different hearing aid controls (e.g., on/off switch, changing programs, telecoil), linking to smartphone apps, and/or care and maintenance (e.g., cleaning, battery change) of hearing aids

Benefits of hearing aids: Reference to the advantages of hearing aids in various listening conditions (e.g., daily living, occupational)

Limitations or side effects of hearing aids: Coverage regarding the possible limitations of wearing hearing aids (i.e., does not restore normal hearing, amplifies background noise) or side effects (e.g., skin irritation, headaches, feedback, improper sound quality, negative self-image)

Cost of hearing aid and reimbursement: Discussing the cost of hearing aids and reimbursement (e.g., insurance)

Hearing aid purchasing process: Guidelines regarding the purchasing process through regular channels such as visiting a hearing care professional (e.g., audiologist, hearing aid dispenser, otolaryngologist) or through direct-to-consumer model (e.g., pharmacy stores, online)

Featuring a celebrity with hearing aids: Raising public awareness by focusing on a celebrity using hearing aids

The purpose of the video: Purpose was categorized into: (1) general information about hearing aids, (2) personal experiences about hearing aids, or (3) promotional information to sell a product or service

For each of the 100 included videos, data were extracted regarding general information, source, popularity, and purpose as follows:

  • General Information: the title, URL, authorship, date of upload, and duration of video.

  • The Video Sources: the source of the video was categorized as: (1) consumer (member of the lay public); (2) professional (a credentialed person, qualified to discuss the topic); (3) television-based clip (any clip that originated from television); and (4) Internet-based clip (any clip that originated from an Internet channel or website).

  • Video Popularity: the number of views, likes, dislikes.



Video Content and Quality Evaluation

First, the content of the videos was examined. Second, the understandability and actionability of the videos were examined.

Content Analysis

The video content was analyzed to determine what pertinent information each video contained in relation to a predetermined fact sheet. The fact sheet was developed considering information that may be of value when looking up information on hearing aids, based on fact sheets from the American Academy of Audiology, American Speech-Language-Hearing Association, Hearing Loss Association of America, and National Institute on Deafness and Other Communication Disorders. Each video was coded 1 (content included) or 0 (content not included) for each of the predetermined content categories listed in [Table 1].
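As a minimal illustration of the binary coding scheme just described, the following sketch records one row per video with a 1/0 flag for each content category in [Table 1] plus the coded purpose. The category keys, file name, and helper function are hypothetical; the authors coded the videos manually and did not publish a coding script.

```python
# Minimal sketch of the 1/0 content-coding scheme; keys and file name are illustrative.
import csv

CATEGORIES = [
    "hearing_mechanism", "hearing_loss_info", "hearing_aid_type",
    "features_functionalities", "handling_maintenance", "benefits",
    "limitations_side_effects", "cost_reimbursement", "purchasing_process",
    "celebrity",
]

def code_video(video_id, present, purpose):
    """Return one coding row: 1 if a category was covered, 0 otherwise, plus purpose."""
    row = {"video_id": video_id, "purpose": purpose}
    row.update({cat: int(cat in present) for cat in CATEGORIES})
    return row

rows = [
    code_video("vid001", {"hearing_aid_type", "handling_maintenance"}, "general_information"),
    code_video("vid002", {"benefits", "cost_reimbursement"}, "personal_experience"),
]

with open("content_coding.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["video_id", "purpose"] + CATEGORIES)
    writer.writeheader()
    writer.writerows(rows)
```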



Assessment of Understandability and Actionability

The PEMAT-A/V is a free, publicly available tool developed for the Agency for Healthcare Research and Quality to assess the understandability and actionability of audiovisual patient education materials.[24] Strong internal consistency, reliability, and construct validity have been reported for the PEMAT-A/V.[24] The PEMAT-A/V has 17 items: 13 related to understandability and 4 related to actionability. Each item is scored as agree (score of 1), disagree (score of 0), or not applicable (no score). Of the 13 understandability items, item 12 (i.e., the material uses visual cues [e.g., arrows, boxes, bullets, bold, larger font, highlighting] to draw attention to key points) was not included because, per the PEMAT-A/V instructions, this item is not applicable to videos. Item 19 (i.e., the material uses simple tables with short and clear row and column headings) was also not used because no tables were included in any of the videos. Understandability and actionability were scored separately by summing the points earned and dividing by the total possible points; this ratio was multiplied by 100 to obtain a percentage for each subscale. Higher percentages indicated higher understandability and actionability, with scores under 70% indicating poor understandability or actionability.[24]
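The subscale scoring rule described above (sum the points earned, divide by the applicable points, multiply by 100, and treat scores below 70% as inadequate) can be expressed compactly as follows. This is a sketch using the item numbering applied in this study; the dictionary layout for ratings is an illustrative assumption.

```python
# Sketch of the PEMAT-A/V subscale scoring rule: 1 = agree, 0 = disagree,
# None = not applicable (excluded from the denominator).
UNDERSTANDABILITY_ITEMS = [1, 3, 4, 5, 8, 9, 10, 11, 13, 14, 18]
ACTIONABILITY_ITEMS = [20, 21, 22, 25]

def pemat_score(ratings, items):
    """ratings maps item number -> 1, 0, or None (not applicable)."""
    applicable = [ratings[i] for i in items if ratings.get(i) is not None]
    if not applicable:
        return None
    return 100 * sum(applicable) / len(applicable)

# Hypothetical ratings for one video: items 13, 18, and 25 were not applicable.
example = {1: 1, 3: 1, 4: 1, 5: 1, 8: 0, 9: 0, 10: 1, 11: 1, 13: None,
           14: 1, 18: None, 20: 1, 21: 1, 22: 0, 25: None}

understandability = pemat_score(example, UNDERSTANDABILITY_ITEMS)  # 7/9 -> 77.8%, adequate (>= 70%)
actionability = pemat_score(example, ACTIONABILITY_ITEMS)          # 2/3 -> 66.7%, inadequate (< 70%)
```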

One of the researchers completed PEMAT-A/V ratings for all 100 videos, and a randomly chosen 20% of the videos were also rated by a second researcher. Both researchers had previously used the PEMAT-A/V and were thus familiar with the tool. The interrater reliability of the understandability and actionability scores was examined.



Data Analysis

Statistical analysis was conducted using IBM SPSS Software Version 24. Descriptive statistics were examined first. Nonparametric tests were chosen for further analyses because the video meta-data and PEMAT-A/V scores failed the Shapiro–Wilk normality test. The Kruskal–Wallis H test was used to examine whether the meta-data and the PEMAT-A/V scores varied across video sources (i.e., professional, television-based, Internet-based, consumer). Pairwise comparisons were performed using the Bonferroni post hoc test for variables that were significant in the Kruskal–Wallis H test. Spearman's correlation was performed to examine the relationships between video meta-data. The manually coded video content themes were converted into binary variables (i.e., coded 0 if the video did not present information about a specific theme and 1 if it did). Associations between video content themes and video source (categorical variables) were examined using chi-square analysis. The intraclass correlation coefficient was computed to examine interrater reliability for the PEMAT-A/V subscale ratings. A significance level of 0.05 was used for interpretation of results.
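The analyses were run in IBM SPSS 24. As a hedged illustration of the same nonparametric workflow in open-source tools, the sketch below uses pandas and SciPy; the file name and column names are assumptions, and the Bonferroni-corrected pairwise comparisons and intraclass correlation reported in the article are not reproduced here.

```python
# Illustrative reanalysis sketch (not the authors' SPSS syntax): Shapiro-Wilk,
# Kruskal-Wallis, Spearman correlation, and chi-square on a hypothetical coded dataset.
import pandas as pd
from scipy import stats

df = pd.read_csv("hearing_aid_videos.csv")  # hypothetical file with coded videos

# Normality check that motivated the nonparametric approach.
print(stats.shapiro(df["views"]))

# Kruskal-Wallis H test: do view counts differ across source categories?
groups = [g["views"].values for _, g in df.groupby("source")]
print(stats.kruskal(*groups))

# Spearman correlation between meta-data variables (e.g., likes vs. dislikes).
print(stats.spearmanr(df["likes"], df["dislikes"]))

# Chi-square test of association between source and a binary content theme.
table = pd.crosstab(df["source"], df["hearing_aid_type"])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(chi2, p)
```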



Results

Video Source and Popularity

Of the 100 most-viewed videos identified on YouTube, 80 were created by professionals, 11 were consumer-created, and the remaining 9 were media-based. [Table 2] presents the descriptive popularity-based meta-data for these videos by video source. The collective number of views was over 13 million. The total duration of all 100 videos was 466 minutes (i.e., 7 hours and 46 minutes), with the shortest video being 26 seconds and the longest being 20 minutes and 27 seconds. The total numbers of thumbs-up (likes) and thumbs-down (dislikes) for these videos were 99,787 and 3,091, respectively.

Table 2

Descriptive statistics of meta-data (i.e., number of views, video length, thumbs-up, and thumbs-down) in 100 most-viewed hearing aids YouTube videos in English by their source (consumer = 11; professional = 80; media = 9)

For each source: mean; median; min to max; standard deviation (SD); standard error (SE); 95% confidence interval (CI); total (where applicable).

Number of views
 Consumer: mean 77,563; median 47,521; min to max 32,911 to 223,042; SD 69,974; SE 21,098; 95% CI 30,553 to 124,573
 Professional: mean 73,916; median 49,514; min to max 27,955 to 422,085; SD 65,293; SE 7,300; 95% CI 59,386 to 88,446
 Media: mean 724,184; median 83,196; min to max 34,502 to 5,787,356; SD 1,899,939; SE 633,313; 95% CI –736,237 to 2,184,607
 All: mean 132,841; median 49,514; min to max 27,955 to 5,787,356; SD 574,971; SE 57,491; 95% CI 18,765 to 246,917; total 13,284,180

Video length (min:s)
 Consumer: mean 8:57; median 8:33; min to max 4:27 to 12:59; SD 2:55; SE 00:52; 95% CI 7:00 to 10:55
 Professional: mean 3:57; median 3:22; min to max 0:26 to 10:50; SD 2:48; SE 0:18; 95% CI 3:20 to 4:35
 Media: mean 5:41; median 3:50; min to max 00:26 to 20:27; SD 5:50; SE 1:56; 95% CI 1:11 to 10:10
 All: mean 4:39; median 4:06; min to max 00:26 to 20:27; SD 3:31; SE 00:21; 95% CI 3:57 to 5:21; total 7:46:00 (466 min)

Thumbs-up
 Consumer: mean 1,217; median 240; min to max 110 to 8,800; SD 2,576; SE 776; 95% CI –514 to 2,948
 Professional: mean 129; median 71; min to max 0 to 1,500; SD 302; SE 23; 95% CI 84 to 174
 Media: mean 8,451; median 120; min to max 0 to 75,000; SD 24,955; SE 8,318; 95% CI –10,731 to 27,634
 All: mean 997; median 89; min to max 0 to 75,000; SD 7,529; SE 752; 95% CI –496 to 2,491; total 99,787

Thumbs-down
 Consumer: mean 34; median 26; min to max 9 to 103; SD 31; SE 9; 95% CI 13 to 55
 Professional: mean 19; median 11; min to max 0 to 124; SD 27; SE 3; 95% CI 13 to 25
 Media: mean 130; median 13; min to max 0 to 883; SD 228; SE 96; 95% CI –91 to 351
 All: mean 30; median 12; min to max 0 to 883; SD 91; SE 9; 95% CI 13 to 49; total 3,091

Association between Video Source and Meta-Data

The Kruskal–Wallis H test was performed to examine whether the meta-data differed between video sources. Video length (chi-square = 17.5, p < 0.001), thumbs-up (chi-square = 15.3, p < 0.001), and thumbs-down (chi-square = 6.5, p = 0.04) showed significant differences between video sources, but no significant difference was found for number of views (chi-square = 2.37, p = 0.30). For video length, pairwise comparisons with Bonferroni post hoc tests showed that consumer videos differed significantly from both professional videos (p < 0.001) and media videos (p = 0.04). For thumbs-up, consumer videos differed significantly from professional videos (p < 0.01). For thumbs-down, however, no significant differences between source categories were found with the Bonferroni post hoc tests.



Association between Different Types of Meta-Data

Spearman's rho correlation test was performed to examine the relationships between meta-data. Number of views had a small positive correlation with thumbs-up (r = 0.28, p ≤ 0.01) and thumbs-down (r = 0.41, p ≤ 0.01). Video length had a moderate positive correlation with thumbs-up (r = 0.53, p ≤ 0.01) and a small positive correlation with thumbs-down (r = 0.28, p ≤ 0.01). Thumbs-up had a strong positive correlation with thumbs-down (r = 0.79, p ≤ 0.01).



Video Content

The video content of the 100 most-viewed YouTube videos was coded according to 11 predetermined themes. [Table 3] presents the percentage of videos presenting information about each of these themes and the chi-square results examining the association between video source and content theme. Hearing aid types, hearing aid features and functionalities, handling and maintenance of hearing aids, and benefits of hearing aids were the categories covered in most of the videos across source categories. Some associations between video source and content categories were also noted (see [Table 3]). For instance, the content of consumer videos primarily included information about hearing loss, hearing aid type, hearing aid maintenance, hearing aid benefits and limitations, the hearing aid purchasing process, and personal experiences related to hearing aids. The content of professional and media-generated videos focused on hearing aid type and general information about hearing aids. The cost of hearing aids and reimbursement was more frequently discussed in videos made by consumers and the media than in those made by professionals. Overall, these results suggest that YouTube videos related to hearing aids cover a range of issues, with both commonalities and differences in content across video sources.

Table 3

Percentage of videos presenting specific theme content in the 100 most-viewed hearing aids-related YouTube videos by their source and contents

Content (percentage of videos: All / Consumer / Professional / Media; chi-square; p-value for the association with source)

Hearing mechanism: All 5; Consumer 27; Professional 1; Media 11; chi-square 14.6; p = 0.001
Information about hearing loss: All 22; Consumer 73; Professional 14; Media 33; chi-square 20.34; p < 0.001
Hearing aid type: All 71; Consumer 82; Professional 70; Media 67; chi-square 0.75; p = 0.68
Hearing aid features and functionalities: All 27; Consumer 36; Professional 24; Media 45; chi-square 2.3; p = 0.31
Handling and maintenance of hearing aid: All 51; Consumer 82; Professional 48; Media 45; chi-square 4.7; p = 0.09
Benefits of hearing aids: All 40; Consumer 82; Professional 32; Media 56; chi-square 10.8; p = 0.005
Limitations or side effects of hearing aids: All 26; Consumer 73; Professional 19; Media 33; chi-square 14.9; p = 0.001
Cost of hearing aids and reimbursement: All 24; Consumer 82; Professional 13; Media 56; chi-square 30.8; p < 0.001
Hearing aid purchasing process: All 29; Consumer 73; Professional 21; Media 45; chi-square 13.6; p = 0.001
Featuring a celebrity: All 8; Consumer 9; Professional 3; Media 56; chi-square 30.9; p < 0.001
Purpose of video:
 (a) General information about hearing aid: All 72; Consumer 9; Professional 80; Media 78; chi-square 24.3; p < 0.001
 (b) Personal experience about hearing aid: All 14; Consumer 100; Professional 1; Media 22; chi-square 78.8; p < 0.001
 (c) Sell a product or service: All 32; Consumer 9; Professional 33; Media 56; chi-square 4.8; p = 0.08


#

Understandability and Actionability

The understandability and actionability of the YouTube video content were examined using the PEMAT-A/V. The intraclass correlation coefficients for the understandability and actionability subscales were 0.79 and 0.89, respectively, suggesting good interrater reliability. [Table 4] presents the descriptive statistics for the individual PEMAT-A/V item ratings. With regard to understandability, over 80% of the videos presented information in a logical sequence (item 10), used the active voice (item 5), and defined medical terms when used (item 4). The videos also had adequate ratings (i.e., over 70%) on items related to purpose (item 1), use of common, everyday language (item 3), and use of easily readable on-screen text (item 13). On the other hand, a large number of videos did not include informative headers, and item 9 received the lowest score. In the actionability subscale, clearly identifying at least one action (item 20) and addressing the user directly (item 21) received adequate scores. Inadequate scores were obtained for breaking actions down into manageable, explicit steps (item 22) and for explaining how to use charts, graphs, tables, or diagrams to take action (item 25).

Table 4

Descriptive statistics of the Patient Education Materials Assessment Tool for Audiovisual Materials (PEMAT-A/V) items

PEMAT-A/V factors and items (frequency in %: Disagree / Agree / Not applicable)

Subscale: Understandability

Topic: Content
 Item 1: The material makes its purpose completely evident (Disagree 22; Agree 78; Not applicable 0)

Topic: Word choice and style
 Item 3: The material uses common, everyday language (Disagree 22; Agree 78; Not applicable 0)
 Item 4: Medical terms are used only to familiarize audience with the terms. When used, medical terms are defined (Disagree 16; Agree 84; Not applicable 0)
 Item 5: The material uses the active voice (Disagree 4; Agree 96; Not applicable 0)

Topic: Organization
 Item 8: The material breaks or “chunks” information into short sections (Disagree 35; Agree 55; Not applicable 10)
 Item 9: The material's sections have informative headers (Disagree 63; Agree 26; Not applicable 11)
 Item 10: The material presents information in a logical sequence (Disagree 13; Agree 87; Not applicable 0)
 Item 11: The material provides a summary (Disagree 33; Agree 56; Not applicable 11)

Topic: Layout and design
 Item 13: Text on screen is easy to read (Disagree 4; Agree 72; Not applicable 24)

Topic: Use of visual aids
 Item 14: The material allows the user to hear the words clearly (e.g., not too fast, not garbled) (Disagree 32; Agree 58; Not applicable 10)
 Item 18: The material uses illustrations and photographs that are clear and uncluttered (Disagree 10; Agree 41; Not applicable 49)

Subscale: Actionability
 Item 20: The material clearly identifies at least one action the user can take (Disagree 31; Agree 69; Not applicable 0)
 Item 21: The material addresses the user directly when describing actions (Disagree 24; Agree 76; Not applicable 0)
 Item 22: The material breaks down any action into manageable, explicit steps (Disagree 43; Agree 57; Not applicable 0)
 Item 25: The material explains how to use the charts, graphs, tables, or diagrams to take actions (Disagree 3; Agree 6; Not applicable 91)

[Table 5] presents the understandability and actionability scores across video sources. The overall understandability score for videos from all sources together was 74%, which is considered adequate. However, the overall actionability score was 68%, which is considered inadequate. The Kruskal–Wallis H test showed a significant difference in understandability scores between videos from different sources (chi-square = 10.14, p = 0.006), but no significant difference in actionability scores (chi-square = 2.08, p = 0.35). Pairwise comparisons of understandability scores with Bonferroni post hoc tests showed that consumer videos differed significantly from professional videos (p = 0.006); no other significant differences were found.

Table 5

Patient Education Materials Assessment Tool for Audiovisual Materials (PEMAT-A/V) scores across video source categories (consumer = 11; professional = 80; media = 9)

For each source: mean; median; min to max; standard deviation (SD); standard error (SE); 95% confidence interval (CI).

Understandability
 Consumer: mean 60.36; median 56; min to max 22 to 100; SD 18.92; SE 5.7; 95% CI 47.65 to 73.08
 Professional: mean 76.14; median 79; min to max 27 to 100; SD 14.01; SE 1.56; 95% CI 73.02 to 79.26
 Media: mean 71.33; median 73; min to max 50 to 83; SD 10.37; SE 3.45; 95% CI 63.36 to 79.3
 All: mean 73.9; median 73; min to max 22 to 100; SD 15.05; SE 1.5; 95% CI 70.98 to 76.96

Actionability
 Consumer: mean 57.64; median 67; min to max 0 to 100; SD 31.37; SE 9.45; 95% CI 36.56 to 78.71
 Professional: mean 69.46; median 100; min to max 0 to 100; SD 38.85; SE 4.34; 95% CI 60.82 to 78.11
 Media: mean 66.56; median 100; min to max 0 to 100; SD 40.93; SE 13.64; 95% CI 35.1 to 98.01
 All: mean 67.90; median 100; min to max 0 to 100; SD 38.12; SE 3.81; 95% CI 60.34 to 74.46



Discussion

This study examined the source, content, understandability, and actionability of YouTube videos related to hearing aids. Results indicated that the majority of the most-viewed hearing aid videos were created by professionals (n = 80), which is not consistent with Basch et al,[19] who found that the majority of the most-viewed tinnitus videos were uploaded by consumers. Interestingly, the mean number of views was nearly 10 times higher for media-created videos (724,184) than for either professional (73,916) or consumer-created videos (77,563). The media-created videos also received many more likes on average (mean = 8,451) compared with consumer (mean = 1,217) and professional videos (mean = 129). One possible reason may be that media-created videos were more likely to feature a celebrity than consumer and professional videos, and celebrities may have attracted more views. This is understandable in a consumer-driven population where celebrity endorsement has been found to increase brand credibility and equity.[30] [31] [32]

Media-created videos were longer on average (5 minutes 41 seconds) than professional videos (3 minutes 57 seconds) but shorter than consumer-created videos (8 minutes 57 seconds). Lengthy videos can be distracting and can result in viewers not watching the entire video; however, the present study indicated a moderate relationship between video length and thumbs-up, with media-created videos receiving many more thumbs-up than the other types of videos. These results are consistent with previous literature; for instance, longer view durations are associated with higher view counts, more likes, and positive review comments.[33] These results highlight the importance of balancing shorter videos that hold viewers' attention against the popularity associated with longer videos.

The videos covered a wide range of information related to the hearing mechanism and hearing loss as well as the types, cost, benefits, and limitations of hearing aids. Of the 100 videos, only five discussed the hearing mechanism, while the majority focused on information such as different types of hearing aids and hearing aid maintenance. Moreover, the current study highlighted some commonalities and differences in video content across video sources. These variations in content may reflect differences in the rationale for creating the videos, as well as in the knowledge and skills of consumers, professionals, and media producers. It is important to note that this study does not suggest that any one type of content is better than another; rather, the analyses highlight the type of content likely to be found based on its source. For instance, consumer-developed videos provided more comprehensive information about the hearing aid purchasing process, costs, and benefits when compared with professional and media sources.

With regard to understandability, many items were rated as superior or adequate. Videos presented information in a logical sequence, used the active voice, defined medical terms, demonstrated a clear purpose, used common everyday language, and used easily readable on-screen text. Informative headers may not be relevant in a video format, or may not be required if other components are adequate or superior. The lowest score was related to the clear use of illustrations and photographs; however, 49% of the videos did not include illustrations and photographs at all, which may not be surprising in light of the video medium. Taken together, the overall understandability of hearing aid-related videos was acceptable.

On the other hand, actionability did not receive any superior ratings. The clear identification of at least one action and addressing the user directly were rated as adequate. A substantial proportion of videos did not break down actions into manageable, explicit steps, and this item received an inadequate rating. The inadequate rating on the item concerning whether the material explained how to use charts, graphs, tables, or diagrams to take action may have contributed to the overall inadequate actionability score. It should be noted that 91% of the videos did not include charts, graphs, tables, or diagrams, which may not be surprising in light of the video medium.

Further analyses revealed that videos uploaded by professionals scored higher in understandability and actionability than videos from other sources, which is consistent with previous research on health care videos.[25] Despite professional videos being of higher quality, media-based videos received more views and likes. Consumers are thus exposed to more information that may not always be of adequate quality. The realization that popularity-driven factors other than quality attract viewers is important during the development and marketing of videos containing higher quality information.



Clinical Implications

To provide educational and community outreach, professionals should be aware of the kind of information to which patients may be exposed. They will then be prepared to answer patients' queries and to educate patients about the accuracy of the information they encounter. Professionals should set aside time to provide appropriate and relevant knowledge to patients and to help dispel any myths that patients might hold. Professionals can overcome barriers to discussing online information with their clients (e.g., clients' concerns about the professional's reaction) by sharing online information that may be beneficial. Professionals can also educate their clients on how to seek, find, understand, and critically evaluate information from electronic sources (e.g., identifying good search terms and credible sources). Finally, professionals need to contribute to the digital landscape by generating evidence-based, accessible information across diverse content areas.



Study Limitations and Further Research

The present study aimed to examine the source, content, understandability, and actionability of information related to hearing aids uploaded to YouTube. The study has some limitations in that the context in which each video was created and uploaded was not considered; this is a drawback because context can influence the content of information related to hearing aids. Furthermore, misinformation related to hearing aids was not considered in the present study. The results indicate the need for future studies that examine and quantify such misinformation. In addition, the analyses in the current study were conducted by doctoral students and professionals within the field of audiology, which can introduce rating bias when assessing the appropriateness of uploaded information. Future studies that include nonclinical individuals might provide a better understanding of how consumers interpret the information in hearing aid videos. Future studies could also examine the relationships among cultural appropriateness, usability, and actionability. Studies focusing on more specific topics (e.g., diagnosis, management) would be of value. YouTube now provides search results based on relevance; future studies could focus on answering specific questions by examining the most relevant videos and the factors contributing to why these videos were selected to be watched.



Conclusion

This study provided insights into the information presented on YouTube regarding hearing aids. It is important that the information related to hearing aids uploaded on the Internet is appropriate and relevant. Videos of higher quality were not necessarily the ones with the highest views or the greatest popularity, so ensuring that higher quality videos are developed and accessed is important. This work contributes to research in consumer health informatics, which is concerned with examining multiple consumer or client perspectives. Studies such as this are important for examining client information from distinct angles such as health literacy and education. This information can then be used to provide health information that enables clients to make their own decisions. Further studies examining hearing aid information from the various sources to which clients are exposed (e.g., news media, social media) will help further the understanding of their knowledge, attitudes, and behaviors. This, in turn, may help in developing appropriate and evidence-based client information and resources related to hearing aids.[34]



Conflict of Interest

None.

  • References

  • 1 Fox S. The social life of health information, 2011. 2015 Accessed August 11, 2019 at: http://www.pewinternet.org/files/old-media/Files/Reports/2011/PIP_Health_Topics.pdf
  • 2 Van de Belt TH, Engelen LJ, Berben SA, Teerenstra S, Samsom M, Schoonhoven L. Internet and social media for health-related information and communication in health care: preferences of the Dutch general population. J Med Internet Res 2013; 15 (10) e220
  • 3 Flanagin AJ, Metzger MJ. Perceptions of Internet information credibility. Journal Mass Commun Q 2000; 77 (03) 515-540
  • 4 Cline RJW, Haynes KM. Consumer health information seeking on the Internet: the state of the art. Health Educ Res 2001; 16 (06) 671-692
  • 5 Pew Research Center. Technology use among seniors. 2017 Accessed December 03, 2019 at: https://www.pewresearch.org/internet/2017/05/17/technology-use-among-seniors/
  • 6 Akkermans M. Ouderen maken inhaalslag op het internet [Elderly making inroads on the internet]. 2014 Accessed August 11, 2019 at: http://www.cbs.nl/nl-NL/menu/themas/vrije-tijd-cultuur/publicaties/artikelen/archief/2011/2011-3537-wm.htm
  • 7 Department of Economic and Social Affairs Population Division. World Population Ageing 2015. 2015 Accessed August 11, 2019 at: https://www.un.org/en/development/desa/population/publications/pdf/ageing/WPA2015_Report.pdf
  • 8 Medlock S, Eslami S, Askari M. et al. Health information-seeking behavior of seniors who use the Internet: a survey. J Med Internet Res 2015; 17 (01) e10
  • 9 Henshaw H, Clark DP, Kang S, Ferguson MA. Computer skills and internet use in adults aged 50-74 years: influence of hearing difficulties. J Med Internet Res 2012; 14 (04) e113
  • 10 Thorén ES, Oberg M, Wänström G, Andersson G, Lunner T. Internet access and use in adults with hearing loss. J Med Internet Res 2013; 15 (05) e91
  • 11 Choudhury M, Dinger Z, Fichera E. The utilization of social media in the hearing aid community. Am J Audiol 2017; 26 (01) 1-9
  • 12 Go E, You KH, Jung E, Shim H. Why do we use different types of websites and assign them different levels of credibility? Structural relations among users' motives, types of websites, information credibility, and trust in the press. Comput Human Behav 2016; 54: 231-239
  • 13 Salmerón L, Fajardo I, Gómez-Puerta M. Selection and evaluation of Internet information by adults with intellectual disabilities. Eur J Spec Needs Educ 2019; 34 (04) 272-284
  • 14 Diaz JA, Griffith RA, Ng JJ, Reinert SE, Friedmann PD, Moulton AW. Patients' use of the Internet for medical information. J Gen Intern Med 2002; 17 (03) 180-185
  • 15 Clement J. Most popular social networks worldwide as of July 2019. 2019 . Accessed August 8, 2019 at: https://www.statista.com/statistics/272014/global-social-networks-ranked-by-number-of-users/
  • 16 Wong CA, Ostapovich G, Kramer-Golinkoff E, Griffis H, Asch DA, Merchant RM. How U.S. children's hospitals use social media: a mixed methods study. Healthc (Amst) 2016; 4 (01) 15-21
  • 17 Madathil KC, Rivera-Rodriguez AJ, Greenstein JS, Gramopadhye AK. Healthcare information on YouTube: a systematic review. Health Informatics J 2015; 21 (03) 173-194
  • 18 Zuzelo PR. Understandability and actionability: using the PEMAT to benefit health literacy. Holist Nurs Pract 2019; 33 (03) 191-193
  • 19 Basch CH, Yin J, Kollia B. et al. Public online information about tinnitus: a cross-sectional study of YouTube videos. Noise Health 2018; 20 (92) 1-8
  • 20 Beaunoyer E, Arsenault M, Lomanowska AM, Guitton MJ. Understanding online health information: evaluation, tools, and strategies. Patient Educ Couns 2017; 100 (02) 183-189
  • 21 Laplante-Lévesque A, Brännström KJ, Andersson G, Lunner T. Quality and readability of English-language internet information for adults with hearing impairment and their significant others. Int J Audiol 2012; 51 (08) 618-626
  • 22 Manchaiah V, Dockens AL, Flagge A. et al. Quality and readability of English-language Internet information for tinnitus. J Am Acad Audiol 2019; 30 (01) 31-40
  • 23 Agency for Healthcare Research & Quality. The Patient Education Materials Assessment Tool (PEMAT) and User's Guide: an instrument to assess the understandability and actionability of print and audiovisual patient education materials. 2013 Accessed January 18, 2020 at: https://www.ahrq.gov/professionals/prevention-chronic-care/improve/self-mgmt/pemat/index.html
  • 24 Shoemaker SJ, Wolf MS, Brach C. Development of the Patient Education Materials Assessment Tool (PEMAT): a new measure of understandability and actionability for print and audiovisual patient information. Patient Educ Couns 2014; 96 (03) 395-403
  • 25 Bellon-Harn ML, Manchaiah V, Morris L. A cross-sectional descriptive analysis of portrayal of Autism Spectrum Disorders in YouTube videos: a short report. Autism 2020; 24 (01) 263-268
  • 26 Gabarron E, Fernandez-Luque L, Armayones M, Lau AY. Identifying measures used for assessing quality of YouTube videos with patient health information: a review of current literature. Interact J Med Res 2013; 2 (01) e6
  • 27 Van den Eynde J, Crauwels A, Demaerel PG. et al. YouTube videos as a source of information about immunology for medical students: cross-sectional study. JMIR Med Educ 2019; 5 (01) e12605
  • 28 Basch CH, Menafro A, Mongiovi J, Hillyer GC, Basch CE. A content analysis of YouTube™ videos related to prostate cancer. Am J Men Health 2017; 11 (01) 154-157
  • 29 Ruppert L, Køster B, Siegert AM. et al. YouTube as a source of health information: analysis of sun protection and skin cancer prevention related issues. Dermatol Online J 2017; 23 (01) pii : 13030/qt91401264
  • 30 Chan K, Zhang T. An exploratory study on perception of celebrity endorsement in public services advertising. Int Rev Public Nonprofit Mark 2019; 16 (2–4): 195-209
  • 31 Jin SV. “Celebrity 2.0 and beyond!” Effects of Facebook profile sources on social networking advertising. Comput Human Behav 2018; 79: 154-168
  • 32 Spry A, Pappu R, Cornwell TB. Celebrity endorsement, brand credibility and brand equity. Eur J Mark 2011; 45 (06) 882-909
  • 33 Park M, Naaman M, Berger J. 2016 A data-driven study of view duration on YouTube. Proceedings of the Tenth International AAAI Conference on Web and Social Media (ICWSM 2016). Accessed December 4, 2019 at: https://arxiv.org/abs/1603.08308
  • 34 Eysenbach G, Sa ER, Diepgen TL. Shopping around the Internet today and tomorrow: towards the millennium of cybermedicine. BMJ 1999; 319: 1294

Address for correspondence

Vinaya Manchaiah, PhD

Publication History

Article published online:
20 November 2020

© 2020. American Academy of Audiology. This article is published by Thieme.

Thieme Medical Publishers, Inc.
333 Seventh Avenue, 18th Floor, New York, NY 10001, USA
