There is a wide range in prevalence for each reported problem. Problems with video quality and service installation were most prevalent. The wide range indicates the importance of having measurement methods that help identify these problems.

Figure B.3 indicates that service providers have a fairly good idea of where problems occur. It is clear that most problems are experienced in the home and access networks. Respondents indicated that in many cases they do not have sufficient diagnostics or reporting capabilities available from the home. Respondents indicated that some of the major causes of problems are in the following areas:
o Outside plant noise ingress.
o Home wiring integrity/connector issues.
o Problems with the RG or ITF (STB).

Respondents also indicated a fair amount of trouble with the content received from content providers. One respondent indicated "they are a cause of many problems" and also indicated that, once identified, his/her company would "alert" the content provider of the problem so that the content provider could fix it as soon as possible. Another response indicated a desire for better standards for the ingress content coming in from content providers, for both linear and CoD content.

A large percentage of respondents (90%) indicated that service quality problems are identified after customers call in. Pro-active tools are highly desirable, especially in the home environment. 77% of respondents indicated that existing picture quality measurement solutions are not accurate. Respondents indicated that customer experience (MOS) measurements are important.

Not all respondents perform MPEG Transport Stream performance monitoring, although a high percentage of respondents performed picture quality analysis and MPEG Transport Stream monitoring.
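As an illustration of the kind of check that basic MPEG Transport Stream performance monitoring can involve, the following minimal sketch (Python, illustrative only and not part of the survey or of this document) scans fixed 188-byte TS packets and counts continuity_counter discontinuities per PID. The input file name is hypothetical, and duplicate-packet and adaptation-field subtleties are deliberately ignored as simplifying assumptions.

# Minimal, illustrative MPEG-TS continuity check: count continuity_counter
# discontinuities per PID. Ignores duplicate-packet and adaptation-field
# subtleties; not a complete TS monitor.
from collections import defaultdict

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def continuity_errors(path):
    last_cc = {}                      # PID -> last continuity_counter seen
    errors = defaultdict(int)         # PID -> discontinuity count
    with open(path, "rb") as f:
        while True:
            pkt = f.read(TS_PACKET_SIZE)
            if len(pkt) < TS_PACKET_SIZE:
                break
            if pkt[0] != SYNC_BYTE:
                continue              # out of sync; a real monitor would resynchronize
            pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
            afc = (pkt[3] >> 4) & 0x3 # adaptation_field_control
            cc = pkt[3] & 0x0F        # continuity_counter
            if afc in (0x1, 0x3):     # counter only advances when a payload is present
                if pid in last_cc and cc != (last_cc[pid] + 1) % 16:
                    errors[pid] += 1
                last_cc[pid] = cc
    return dict(errors)

# Hypothetical usage against a captured stream file:
# for pid, n in continuity_errors("capture.ts").items():
#     print(f"PID 0x{pid:04X}: {n} continuity errors")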
77.7% of respondents indicated that service quality problems are the primary cause of customer churn. There is a desire for standards to offer threshold values and methodologies to establish threshold values. Although customers may experience visual degradation, if these degradations are minor and occur infrequently (perhaps once an hour), a customer will most likely not contact the service provider.

Other desires that were indicated:
o Loudness variations:
  - Between services.
  - Transitioning from and to commercials.
  - Balancing audio levels between channels.
o Indicators of impairments. A wide range of video and audio artifacts can be distinguished, as observed in Figure B.2.
o An automated method to obtain a MOS score for a CoD asset objectively (i.e., without having to view it subjectively).

Other Considerations

Customers do not share their opinion in terms of network performance parameters. MOS is a statistical measure; there will be outliers, i.e., customer opinions that do not match the average of the population. For perceptual degradation of the streamed content, customers may share additional/supplemental information about their experience in terms of the presence of certain QoE indicators (e.g., blockiness, black screen) as service degradations, and the length, frequency, etc., of the QoE indicator as part of the service degradation. While such information may be shared, customers still rate the service in terms of poor to excellent. Yet, these matters do relate to customer experience; e.g., the longer and the more often an IPTV service impairment such as blockiness occurs, the worse the customer experience.
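To make concrete the point that MOS is a statistical measure with outliers, the following is a minimal, purely illustrative sketch (Python). The 1-5 rating scale, the sample scores, and the two-standard-deviation outlier rule are assumptions for illustration and are not defined by this document.

# Illustrative only: MOS as the mean of individual opinion scores, with
# "outliers" defined here (by assumption) as ratings more than two standard
# deviations from the mean.
from statistics import mean, stdev

ratings = [4, 5, 4, 3, 4, 5, 1, 4, 4, 2]   # hypothetical per-viewer scores on a 1-5 scale

mos = mean(ratings)
sd = stdev(ratings)
outliers = [r for r in ratings if abs(r - mos) > 2 * sd]

print(f"MOS = {mos:.2f}, std dev = {sd:.2f}, outlier opinions = {outliers}")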
IPTV service providers are interested in reducing customer help desk calls related to customer experience issues. Customers will call the help desk for a variety of reasons, but typically not to report that their service experience was excellent. In other words, the granularity of bad service (e.g., "poor", "bad", "could be better") may not matter that much; it was bad enough for the customer to make the phone call.

"Service performance" is a unique category of measurements that relates to customer experience. This information is only useful to the IPTV service provider for monitoring and alarming, and typically involves overall service quality reporting on degraded service events. An example is the percentage of IPTV sessions experiencing X degraded service quality events per Y hours. In some cases, such service performance can be based on customer experience (i.e., MOS). TMF GB938 [34] describes such quantities. However, customers will not share their experience in terms of these service performance quantities; they will share their experience in terms of poor to excellent.

Customer opinions may be influenced by other subjective criteria that are not measurable in a network. The QoE impacts of such criteria are for further study.
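As a sketch of one possible reading of the service performance quantity mentioned above (the percentage of IPTV sessions experiencing X degraded service quality events per Y hours), the following Python fragment counts a session as degraded when at least X events fall within any Y-hour window. The interpretation, the session/event data structure, and the thresholds are assumptions for illustration; TMF GB938 defines such quantities formally.

# Illustrative sketch: percentage of IPTV sessions with at least x_events
# degraded service quality events within any y_hours window.
def percent_degraded_sessions(sessions, x_events, y_hours):
    """sessions: {session_id: list of degraded-event timestamps, in hours}"""
    degraded = 0
    for events in sessions.values():
        events = sorted(events)
        # slide a y_hours window across this session's events
        for i in range(len(events)):
            j = i
            while j < len(events) and events[j] - events[i] <= y_hours:
                j += 1
            if j - i >= x_events:
                degraded += 1
                break
    return 100.0 * degraded / len(sessions) if sessions else 0.0

# Hypothetical example: 2 of 3 sessions see >= 2 events within one hour.
example = {
    "s1": [0.1, 0.4, 5.0],   # two events within an hour -> degraded
    "s2": [1.0],             # single event -> not degraded
    "s3": [2.0, 2.2, 2.3],   # three events within an hour -> degraded
}
print(percent_degraded_sessions(example, x_events=2, y_hours=1.0))  # ~66.7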
B.1 ATIS IIF Survey

B.1.1 Background

All survey respondents were service providers offering IPTV wireline services. Areas of expertise of the individuals responding included architecture development, fault management, video planning, network design and implementation, and video operations. Each respondent was counted as a single response regardless of whether or not they represented the same company. The answers are the opinions of individual respondents and do not indicate ATIS opinion. The approach used in this report was simply to report individual responses, to be as descriptive as possible to maximize potential value, and to avoid getting into service provider specifics.
B.1.2 Results

[Only part of the prevalence table is recoverable: respondents rated trouble types from 1 to 7; network faults averaged 3.4, headend faults that impact customers averaged 3.5, and one additional category averaged 4.2.]

Responses to the question "Please list and describe any other trouble types not mentioned above" are shown in Table B.2.

Table B.2: Additional problems identified by IPTV SPs (Individual responses)

Trouble Type                                   | Description      | Prevalence: Least (1) to Most (10)
Outside plant noise ingress                    | VDSL             | 7
Home wiring integrity                          | Connectors       | 7
Digital Terrestrial TV (DTT) reception quality | Home environment | 9
PVR / DVR                                      |                  | 2
What trouble types seem to be particularly difficult to resolve?
o Very intermittent video freezes. In a complex system, there are many things that could cause a packet loss. We are working to get the end-to-end system with enough diagnostics so that Tier 1 staff can quickly isolate problems.
o Noise ingress.
o DTT reception and ADSL line speed issues.
o Sound variations between services and commercials.

Do you use measurements that calculate user-perceived quality by estimating Mean Opinion Scores (MOS), for any of the following?
o 50% did not respond.
o Troubleshooting (50%).
o Planning (33%).
o Monitoring (33%).
o Other (please describe): Customer satisfaction, video quality, audio quality, video stability, audio stability, audio sync, audio loudness between channels, etc. Testing before fielding.

Please rank the following by importance, 1 (lowest) to 3 (highest):
o Mean Opinion Scores: responses from 1-3; average 2.5.
o QoE indicators, such as blockiness or audio drop-out: responses from 1-3; average 1.5.
o Low-level QoS metrics, such as packet loss and jitter: responses from 1-3; average 2.

Location of the Problems
Please indicate where your troubles occur. Please assign a proportion of the overall troubles to each network segment.