
Project P2P-Next (216217)

Deliverable number 8.1.2

Initial Analysis of the Initial Live Prototype Implementation

Contractual Date of Delivery to the CEC: 30 June 2010

Date of Completion: Xth June 2010


Author(s): Johnathan Ishmael, Keith Mitchell

Contribution Partners: ULANC (editors), VTT, STM, UPB, JSI, TUD, FAB, MFG

Workpackage: WP8

Security: Public

Nature: Deliverable

Version: v1.0

Total number of pages:
Abstract:

This deliverable presents results from data pertaining to all aspects of Living Lab trials carried out during Phase II of the P2P-Next project. The period encompasses M18 to M30.

The results and recommendations within this document will be used to further refine the general P2P-Next architectural specifications as well as the reference implementations NextSharePC and NextShareTV.

The document reports on the initial user studies carried out by a number of project partners in the live Living Lab environments.

Keyword list:

WP8, Initial Prototype, Living Lab, User Trials, NextShareTV, NextSharePC







Version Summary

Version Number | Produced By | Description | Date
0.1 | Keith Mitchell | Initial TOC | 10th May
0.2 | Keith Mitchell | Initial ULANC Structure | 19th May
0.3 | Keith Mitchell, Johnathan Ishmael | VTT contribution added. VTT authors: Virpi Oksman, Antti Tammela, Tuomo Kivinen, Timo Kinnunen, Kaisa Kujanpää, Jori Paananen | 21st June































Executive Summary

This document is the second deliverable for WP8 and relates to Phase II of the P2P-Next Living Lab. Phase II is concerned with all aspects of migrating the NextShare service to a production-like environment on which we can continually expand and develop services. The purpose of the Living Lab is to provide a platform for iterative trials and experiments of both the PC and STB platforms, ‘in the wild’. The outcomes, results and analysis are to be used to refine the general P2P-Next architecture as well as to provide a basis for scientific publication.

The document is divided into two main topics: the first provides results related to the technical trials carried out to date between M18 and M30; the second describes each of the Living Lab trial sites in detail.

The evaluation and results for each Living Lab cover the logistical, technical, performance and QoE aspects of the overall P2P-Next platform (code-named NextShare).

Each sub-section within the document provides a conclusion section.

In summary, our findings to date show that:

- The initial set of user studies using the STB show…..

- The technical trials carried out demonstrate







Table of Contents

2.0 Technical Trials
  2.1 M24 Trial
  2.2 Far North Living Lab
  2.3 Closed Swarm
3.0 Living Labs
  3.1 VTT
    3.1.1 User expectations towards P2P TV services
    3.1.2 Background information
    3.1.3 Scenario survey
    3.1.4 Media diaries
    3.1.5 Summary
    3.1.6 P2P Next PC Service
    3.1.7 The User Interface
    3.1.8 Content
    3.1.9 Living lab trial site
    3.1.10 User feedback
    3.1.11 Video Quality Trial
  3.2 ULANC
    3.2.1 Physical Infrastructures
    3.2.2 Preparation for Lab Deployment
    3.2.3 ULANC P2P-Next Service Infrastructure
    3.2.4 LivingLab Statistics Service
    3.2.5 Content Ingest
    3.2.6 NextSharePC User Experience
    3.2.7 Discovery Feed Service
    3.2.8 Fault Reporting
    3.2.9 Trials and Analysis
    3.2.10 Planning
    3.2.11 Lessons Learned
  3.3 UPB
    3.3.1 Trial Site
    3.3.2 Available Content
    3.3.3 Support
    3.3.4 User base
    3.3.5 Automated Infrastructure and Result Processing
    3.3.6 Trials
    3.3.7 Results
    3.3.8 Future work
  3.4 JSI
    3.4.1 Introduction
    3.4.2 JSI/RTV Living Lab goals
    3.4.3 Service requirements and selection
    3.4.4 Service development and deployment plan
    3.4.5 Service selection
    3.4.6 Physical infrastructure
    3.4.7 Service development
    3.4.8 Client interface
    3.4.10 Current service setup at JSI
    3.4.11 Trials and analyses
    3.4.12 Recommendations
    3.4.13 Future plans
4. WP8 Living Lab Roadmap
Annex I: Trial Proposals
5. References




Table of Figures

Figure 1: Electronic Media Products
Figure 2: The most interesting media products
Figure 3: Foreign and Domestic Users
Figure 4: What to Share?
Figure 5: Future TV program search
Figure 6: Future TV program search
Figure 7: Scenario: Chatting on TV
Figure 8: Payment methods
Figure 9: The P2P-Next Portal Overview
Figure 10: The three levels of ATOM feed
Figure 11: P2P-Next portal displays programme guide for live television channels
Figure 12: P2P-Next portal displays programme guide for on-demand programmes
Figure 13: P2P-Next portal displays results for search operation
Figure 14: P2P-Next portal 2010 and Drupal CMS
Figure 15: Opening page of P2P-Next portal
Figure 16: Play window with “close window” and “pause” buttons. Placing mouse cursor over the SwarmPlugin icon on the lower right displays download, upload and sharing (helping) information.
Figure 17: On-Demand videos. Pointing the video’s name with the mouse brings up an info-screen.
Figure 18: NextShare TV page in Owela forum containing the instructions how to get SwarmPlugin installed and the link to the video portal.
Figure 19: Different aspects of user experience
Figure 20: The amount of users per day
Figure 21: Total watching times per day
Figure 22: The average number of users and use minutes per day
Figure 23: Number of users per day
Figure 24: Total minutes viewed per day
Figure 25: Average number of users and use minutes per day
Figure 26: P2P-Next statistics web page
Figure 27: Arrivals of SwarmPlugin peers per day
Figure 28: SwarmPlugin peers in system
Figure 29: Average number of helping peers
Figure 30: Average download rates
Figure 31: Average upload rates
Figure 32: Trial setup
Figure 33: Throughput
Figure 34: Delay
Figure 35: Throughput
Figure 36: Jitter
Figure 37: Throughput
Figure 38: Delay
Figure 39: A Typical ULANC Campus Room
Figure 40: Resnet Network Structure
Figure 41: A Wireless Mesh Network
Figure 42: A Mesh Router
Figure 43: Summary of the ULANC IPTV Headend Infrastructure
Figure 44: The P2P based BBC Content
Figure 45: The P2P-Next Video Cache Main Menu
Figure 46: The P2P-Next Video Cache Main Configuration page
Figure 47: Living Lab Support Processes
Figure 48: RemoteTest Architecture
Figure 49: Schematic representation of JSI/RTV Living Lab
Figure 50: Living Lab ingest point monitoring via MRTG
Figure 51: JSI Living Lab site example pages
Figure 52: Sample RTV Living Lab pages
Figure 53: Part of the RTV Living Lab page, access to TV channel, event toolbar
Figure 54: Ingest point CPU usage
Figure 55: Cross channel peer count (week)
Figure 56: RTVSLO2 peer counter (day)
Figure 57: All peer counter (year)
Figure 58: Ingest host memory usage (month)
Figure 59: Ingest host load (month)
Figure 60: Octoshape and NextShare





Index of Tables

Table 1: Open questions classified into 9 topics
Table 2: Delivery of STBs Actual Vs Planned
Table 3: An overview of tickets raised in relation to STB operation


1.0 Introduction

This deliverable details the initial results relating to the evaluation of the P2P-Next platform between M18 and M30. The platform developed as part of the project is divided into the PC client and the digital set-top-box (STB) versions, named NextSharePC and NextShareTV respectively. These terms will be used throughout the remainder of this document.


The high-level structure of this document is organised around the two primary types of trials being carried out within P2P-Next.




- Technical Trials: This kind of trial we expect to carry out as an open Internet trial. This means that some publicity may occur in order to attract users to the trial site. From here they can download the software and take part in a trial. This type of trial is expected to be short-lived and of limited time span. For example, in July 2008 (M7), a trial was conducted in order to gain some initial results pertaining to the live streaming P2P engine. This trial was advertised through the media (BBC Technology News Page and Slashdot) and a large number of users downloaded and used the software for a limited period. These types of trials target the PC platform and Internet-based users. Each trial offers the trialists the option to sign up for future trials.

- User Trials: In this kind of trial we carry out longer-term studies and develop a closer relationship with the trialists, both on the PC and STB environments. Here the aim is to deploy and test the software system on a more production-like basis (i.e. 24/7/365), and to gain a deeper understanding of the P2P-Next approach.


In section 2 we describe the technical trials which have taken place.....

In section 3 we describe each Living Lab trial site.

Section 4 provides a roadmap for future trials.

During the lifetime of the project we envision a number of user trials. These fall into the following broad categories:











2.0 Technical Trials

During the period M18 to M30 there have been the following technical trials:

- M17
- M24
- Closed Swarm internal trial
- Far North Living Lab Stunts

2.1 M24 Trial

2.2 Far North Living Lab

2.3 Closed Swarm


3.0 Living Labs

The Living Labs have dependencies on nearly all other work packages in the project: they require features to be developed, integrated and tested prior to trials being conducted. In order to track the dependencies on WP8, the project maintains a roadmap of features, their current status and the expected time at which they are to be made available to the Living Labs for testing.

Based upon the roadmap as of M30, there are currently three features available to WP8 that have been certified as passing the quality assurance standards of the down-stream workpackages (WP6/WP7). These three features are:

- Live streaming of video content from within the P2P-Next Core
- A web portal for viewing content, which renders EPG data based upon the WP5 meta-data standard
- Integration of the NextShareCore with a web browser plugin.

It is these three features which provide the foundation for tests within the Living Labs. For a full list of anticipated features, the associated roadmap and future time-line, see Appendix I.
line see Appendix I.


3.1 VTT

3.1.1 User expectations towards P2P TV services

3.1.2 Background information


VTT Living Lab focuses on user-centred, open innovation research and development of P2P-based multimedia services. Our main focus is to develop and create multimedia services and concepts in co-operation with users and consumers. The approach relates to concepts such as user innovation and human-centred design. The idea behind user innovation is that at the site of implementation, many products and services could be developed, or at least refined, by users (Hippel et al. 1986).

The human-centred design process consists of successive cycles of specifying different use contexts and user requirements, producing design solutions, evaluating the solutions against the requirements, and refining the context of use (Kaasinen 2005). In the VTT Living Lab, user feedback is collected with several methods to evaluate the quality of user experience and to guide the design of NextShare PC.

Having a clear understanding of user requirements and the factors that shape the acceptance of P2P services has many benefits. In principle, these include lower development costs, shorter development time, lower maintenance costs, a longer product life cycle, a stronger brand and more satisfied clients. It is suggested that product quality will also be better because the product corresponds to the needs of the users (Usinacts, 2000).



The quality of user experience is also a major competitive factor. The service or device also has to support the user experience, which is not possible without learning from end users. Ideally, when a product has been developed based on users' requirements, the product also meets the need that it was supposed to, and then there is a real market for the product.



In our earlier report (D8.1.1) we used different empirical methods such as focus groups and surveys in order to understand users' attitudes towards some P2P TV features, so that we could provide a valuable resource for design. In the earlier report, we found that users emphasised ease of use. User-friendliness was regarded as one of the most important characteristics of NextShare PC or any other media service. Users also appreciated freedom of choice, functionality and picture quality. Not surprisingly, users also wished that there should not be too many advertisements in this kind of service. There was also some concern regarding data security issues of P2P services. Users were especially concerned about their personal or payment information falling into the wrong hands.



To understand emerging trends in the use of IPTV services and the quality of user experience related to these services, we have conducted more extensive research on user expectations towards P2P TV services, which we present in this deliverable. We combined a scenario survey with media diaries in order to understand some features of the circumstances at work in our participants' current use of TV, and so that we might better understand users' needs and attitudes towards future P2P TV services. Users with a wide range of demographic backgrounds, including students, families and senior citizens, participated in this study. With the scenario survey we collected information about the most interesting media contents, social media features, program search options and preferred payment systems.

In addition, as good video quality was seen as one of the most important characteristics of a P2P service, the effect of decreasing network quality on NextShare live streaming video quality was measured in order to determine the network parameter limits for tolerable video quality.




The first VTT Living Lab test with the P2P-Next TV service in evolving real-life contexts started on the 15th of April 2010. The purpose of the test is to evaluate the P2P-Next solution and its functioning against user feedback in real everyday-life use situations. In this deliverable, we present the results and the challenges from the Living Lab tests, as well as the technical work of designing the P2P-Next PC Service and the statistics of peers in the system.

3.1.3 Scenario survey

The scenario survey consisted of visualisations of different P2P TV use situations and a questionnaire concerning the qualities or functionalities mentioned in the scenario. 100 users responded to the survey; this deliverable details the questionnaire. The respondents' age range was from 19 to 63 years, with a mean age of 29.


Scenarios are personalised fictive stories which concretise the usage of services or devices for users when the actual product is not yet available for testing. Scenarios help users to understand what is meant by different interactive qualities and functionalities.

According to Nielsen (1993), scenarios are descriptions of a single user or a group of users who are using a device or a service in order to achieve a certain result in certain circumstances and within a certain time period. Scenarios are simplified descriptions without technical details and they are written from the user's point of view. Moreover, scenarios can also be utilised to describe how future devices could be used. They are best suited for early stages of development, especially for requirement specification.


Electronic media products

According to the scenario survey, the on-line newspaper is the most popular electronic media product. Internet radio, MP3 players, and TV programs on the Internet are also important. Mobile television, electronic books, and Internet books are not yet considered to be important.


[Figure 1 shows importance ratings (scale 1-10): MP3 6.19, TV programs in internet 5.86, Mobile television 2.17, Books in internet 3.96, Electronic books 3.88, Newspapers in internet 7.29, Internet radio 6.24.]

Figure 1: Electronic Media Products

Media formats

The most important media formats were the latest news, music and the news archive. Books, radio shows and magazines did not interest users. Advertisements (product information) also did not attract the users.


Figure 2: The most interesting media products



Foreign news

On average, Finnish users were eager to view news from foreign countries. A user should be able to view a range of varied content and have easier access to foreign programmes.

"Easier access to foreign programmes and channels, also outside the cable network and without a satellite dish."

"More productions from different countries and small production companies, and interesting, valuable new content innovations, not only western mass-production."

In turn, local content also raised interest among participants.

"Hyper-local news and information from the neighbourhood communities."


[Figure 3 shows importance ratings (scale 1-10): Foreign news 7.14, Foreign news every day or weekly 5.02, Domestic news is enough 5.32, If something special happens 5.98.]

Figure 3: Foreign and Domestic Users



To share media

[Figure 4 shows how important it is to share different media (scale 1-10): Photo 6.81, Video 3.97, Music 4.04, Movie 2.76, News videos 3.88, TV series 2.59, YouTube 4.66, Radio shows 2.93.]

Figure 4: What to Share?


When asked what kind of media content users would like to share, personal photos and videos shared with friends and family members raised interest among most users. The participants are already used to sharing them efficiently in social media such as Facebook and MySpace.



To share situations, video topics

When we asked the users about their motivation to share pictures or videos, three main topics emerged: journeys, special events and family (see table below).

"I could share a picture from a long journey or from a party that I've participated in."

"Family is important for me. I could share wedding pictures or videos of my son."



Table 1: Open questions classified into 9 topics.

Topic                              Users
To inform others                   3
Funny, interesting stories         16
Friends and long distance          3
Journey                            30
Friends in pictures                14
Special events, party              33
Other, because I promised          4
Never share                        1
Family, pets, birthdays, everyday  32



Telling about funny situations and stories was also a common motivation to share pictures or videos. "If I find something funny from the Internet, I would like to send it to my friends."

People also mentioned many other situations and things that they like to share. For example, if friends are far away, it's good to share everyday situations with pictures or videos. "My grandchild is an exchange student in New Zealand. I send videos on topics that are important for her. I've sent videos of her dog and of a family holiday." It's also nice to tell a friend about important events. "There was this very important campaign video which I sent immediately to others."


Social television browsing

According to the study, users are willing to use different browsing methods to browse TV programs. All methods were assessed at about the same importance level. Systems of social recommendation and personal interest were regarded as important search criteria alongside the traditional channel-based search methods.



[Figure 5 shows importance ratings (scale 1-10): TV channels 6.99, Recommendations 6.76, Personal interest 6.88, Search 6.51, Category 6.59.]

Figure 5: Future TV program search

Social media and television

Users thought that Facebook and similar social media products are easiest to use on computers. According to this study, they were not at the moment very interested in using them on mobile phones or on their regular television sets. This will be an important subject of study later, as iPhones, which make Internet and Facebook use much easier, have become more popular within a short time period in Finland as well.



[Figure 6 shows importance ratings for using Facebook on different devices (scale 1-10): Mobile phone 3.86, PC 7.27, Television 2.72, Radio shows 1.93.]

Figure 6: Future TV program search


To chat and view television at the same time

Quite surprisingly, users were not willing to chat with their friends and view television programs at the same time. On average, they rated the importance of this feature at only 3.4 (scale 1-10). The concept might have been too complicated to assess, because users already view television and send text messages at the same time. In fact, many thought that they would like to centre their entire attention on the program they are watching. They would prefer other, more discreet forms of interactivity for TV, such as leaving text comments which friends could see later.

"It would be nice to know if your friends are following the same programs; you could leave some comments to your friends regarding the episode."

At the moment, it is mainly the mobile phone which makes the interactivity of regular TV possible in Finland. It is possible to comment on some current affairs programs with text messages, and the messages then appear on the TV screen in the public broadcast. Chatting on TV is possible with the mobile phone. In fact, users suggested that the future P2P TV system should have a usable and technically functional integration with the mobile phone.





Figure 7: Scenario: Chatting on TV





Payment methods

Users were willing to pay a fixed price for this kind of peer-to-peer service. However, there were many differing opinions and it is important to give users different options. Some people just want to subscribe to their favourite series; others pay for everything at once.

"I don't like to finance other people's TV watching, so pay per view."

"Fixed monthly or annual fee, limited access to all contents."

"Some kind of pre-paid card. It could be used to access all channels. You could spend the balance on the time used for watching."

"Completely sponsored by commercials. The current TV licence is useless."

"A good trial offer to all contents."

"60% sponsored by commercials, 40% pay what you view."

"With a relatively low price (like 100 e/year) you could download all possible TV contents to your computer. Like all the best American TV series by the time they are broadcast in the US."

"Paying per single view does not sound like a very convenient solution, and I would not use it other than in special cases. For instance, if there is a movie which I must see."





[Figure 8 shows importance ratings (scale 1-10): Fixed fee with no limitations 6.89, Pay per view 5.38, Fixed fee with access to delimited content 5.46, Channel fee 5.38, Series or production season fee 6.21, Regularly repeated program fee 5.24, TV fee 4.83.]

Figure 8: Payment methods






3.1.4 Media diaries

In order to find out how users perceive future P2P TV services as a part of their changing everyday-life media usage contexts, and what they expect from them, we gathered some ethnographic research material in addition to the survey, e.g. media diaries combined with some photographic activities.

In Finland, ten households with a total of twenty-two household members were asked to keep media diaries of their IPTV and other television viewing for one week. Nine of the participants were men and thirteen were women. The participants' ages ranged from one to seventy years. Parents filled in the diaries on behalf of small children. The majority of the adult participants were highly educated (university or polytechnic).

We included different kinds of household types, from families with children to single and retired persons, in our study. The participants were asked to write down the time they watched TV, the program they watched, and to comment on the watching situation. We wanted to know what positive and negative sides there were in different watching situations in order to get insights and details on how to develop certain IPTV requirements and qualities to meet user acceptance.



The media diary studies provided findings on users' current media use and television watching, exploring the meanings people associate with TV in their everyday lives and unpacking some details of how IPTV might impact the social situations, rhythms and routines in which TV is viewed. Younger users already chatted on their laptop computers and watched TV at the same time. More and more users are already using the Internet and TV together, as the activity of viewing TV programs converges with other on-line activities. "Television is just one of the screens at home. It could be handy, for example, to manage your banking account on television," said a 21-year-old man.


At the same time, we noted a trend in the impact of the changing experience of TV viewing on the everyday organisation of domestic life: a movement away from traditional family viewing towards more individualised arrangements.

According to the media diaries, there are differences in how much people actually watch TV together and how important they consider the shared experience of TV watching to be. Family conditions, household size and life situation influence how TV is watched and how many possibilities there are to watch TV together. For instance, a 70-year-old woman, who usually watches TV alone, writes in her media diary:

"I got to manage the remote control however I pleased all day. It's good that I don't have to take account of others, but it's not good that I can't exchange opinions with others while watching."



One of our media diary study participants didn't have a TV at home at all and she only used her laptop to stream TV programs on-line. She follows certain TV series regularly but she is not tied to the broadcasting times. However, sometimes her PC might not support the technology that is used by the video streaming services. She considered the viewing conditions rather uncomfortable due to several user-experience-related factors, but she still wouldn't like to purchase her own TV.

"This time I was annoyed by the bad quality of the image. When the image is scaled to full-screen it becomes blurry, and the small image is sometimes annoying to watch. Maybe I should finally purchase a television. But then the TV would be on all the time and I wouldn't get anything done."


The participants appreciated the freedom to be able to choose what and when they watch instead of being tied to the broadcasting times. Several people programmed their receivers to save certain TV programs and then watched those recordings at a time most suitable for them. It is common to skip and fast-forward the commercials, but some participants fast-forwarded parts of the TV show that they considered uninteresting.


3.1.5 Summary

The future P2P TV should combine both familiar and novel features, which support choice, control, interaction and a quality watching experience. Being able to pause, fast-forward and skip parts of the program as they wish is essential to the users' experience of the service. This obviously creates challenges for advertising. The quality of the image is also an important factor in the whole user experience. The ideal P2P service should offer users a range of varied content, both broadcast and user-generated types of content, and even new kinds of content genres and production types. The payment system should be quite simple to handle, for instance a fixed price per month. However, it is important to give users different payment options. Users suggested that the future P2P TV system should have a usable and technically functional integration with the mobile phone. In the next Living Lab trials this year, it will also be crucial to study different payment methods and test mobile payment systems.


3.1.6 P2P Next PC Service

3.1.6.1 VTT Content Ingest Trial

VTT carried out a P2P-Next trial during April 2010 in Finland. The content used was provided by SuomiTV (http://www.suomitv.fi/), Fabchannel and the Ääni ja Vimma festival (http://www.nk.hel.fi/aanijavimma/index.php). VTT signed contracts with the bands and content providers, who allowed VTT to use the content via the P2P network; therefore the copyright issues were solved.


Linear TV programs and VOD content were ingested. However, our main focus was on ingesting time-based, scheduled TV programs; we hoped to find out whether the P2P-Next platform is suitable for delivering a normal linear television program, how flexible P2P is, what the users' experience and feedback are, and how serious the network bottleneck is.

The following table shows the original content information.


Content Provider | Content | SD/HD | File Format | Codec | Bit rate
Suomi TV | news | SD | MOV | AVC/AAC | 52.3 Mbps
Suomi TV | news | HD | MOV | XD5C/PCM | 1626 Kbps
FabChannel | music video | SD | MOV | DV/PCM | 30.3 Mbps
FabChannel | music video | HD | MOV | HDV5/PCM | 19.9 Mbps
Ääni ja Vimma Festival | music video | SD | MPG | MPEG video/audio | 7346 kbps


All content sent to VTT was in the form of files. The duration of the file from the "Ääni ja Vimma" festival was about three and a half hours. All files were re-encoded by QuickTime Pro with the MPEG add-on. At first, we tried using ffmpeg to encode the files, but it was too slow to encode large files.

Each file for ingesting was re-encoded at three bit rates: 540 kbps, 1400 kbps and 4000 kbps. The file format is MOV. The files for linear TV programs were further transcoded to the TS file format by VLC. No live transcoding was needed because we were using files. The A/V codec was H.264/AAC.
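
For illustration, a comparable batch re-encode to the three target bit rates could be scripted as shown below. This is only a minimal sketch under the assumption that an ffmpeg build with libx264 and AAC support is available; it is not the tooling used in the trial (QuickTime Pro was used because ffmpeg proved too slow on the large files), and the file names are hypothetical.

# Minimal sketch (not the trial tooling): re-encode one source file to the
# three target bit rates used for ingestion, assuming ffmpeg with
# libx264/AAC is available on the PATH. File names are hypothetical.
import subprocess

SOURCE = "aani_ja_vimma_230410.mov"   # hypothetical input file
BITRATES_KBPS = [540, 1400, 4000]     # target bit rates from the trial

for kbps in BITRATES_KBPS:
    output = "aani_ja_vimma_230410_%dk.mov" % kbps
    subprocess.run(
        ["ffmpeg", "-y",
         "-i", SOURCE,
         "-c:v", "libx264", "-b:v", "%dk" % kbps,
         "-c:a", "aac", "-b:a", "192k",
         output],
        check=True)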


The following listing shows the encoding parameters of a file at an average bit rate of 4 Mbps:


General
Complete name : F:\aanivimma\aani_ja_vimma_230410_4M.mov
Format : MPEG-4
Format profile : QuickTime
Codec ID : qt
File size : 6.53 GiB
Duration : 3h 43mn
Overall bit rate : 4 189 Kbps
Encoded date : UTC 2010-04-26 19:42:46
Tagged date : UTC 2010-04-26 19:46:43
Writing library : Apple QuickTime

Video
ID : 2
Format : AVC
Format/Info : Advanced Video Codec
Format profile : Main@L3.1
Format settings, CABAC : No
Format settings, ReFrames : 2 frames
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Duration : 3h 43mn
Bit rate mode : Variable
Bit rate : 3 994 Kbps
Width : 1 024 pixels
Height : 576 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 25.000 fps
Resolution : 24 bits
Colorimetry : 4:2:0
Scan type : Progressive
Bits/(Pixel*Frame) : 0.271
Stream size : 6.23 GiB (95%)
Title : Apple alias tiedon käsittelijä
Encoded date : UTC 2010-04-26 09:56:55
Tagged date : UTC 2010-04-26 19:46:43
colour_primaries : BT.709-5, BT.1361, IEC 61966-2-4, SMPTE RP177
transfer_characteristics : BT.709-5, BT.1361
matrix_coefficients : BT.709-5, BT.1361, IEC 61966-2-4 709, SMPTE RP177

Audio
ID : 1
Format : AAC
Format/Info : Advanced Audio Codec
Format version : Version 4
Format profile : LC
Format settings, SBR : No
Codec ID : 40
Duration : 3h 43mn
Bit rate mode : Constant
Bit rate : 192 Kbps
Channel(s) : 2 channels
Channel positions : L R
Sampling rate : 48.0 KHz
Resolution : 16 bits
Stream size : 303 MiB (5%)
Title : Applen äänimedian käsittelijä / Apple alias tiedon käsittelijä
Encoded date : UTC 2010-04-26 09:56:55
Tagged date : UTC 2010-04-26 19:46:43



3.1.6.2 Geographical locations and distribution of servers

Ingesting servers were located in Espoo and Oulu (see map).

A private dedicated tracker was used for posting VOD files. Each file was seeded by one seeder server and two cache servers which allow overlapping torrents. The purpose of this configuration was to maximise the number of distributed copies.

The seeding tools used in this trial were the M24.1 source code from WP4.
Linear TV program seeding

A command line example in a BAT file is as follows:


python.exe BaseLib
\
Tools
\
createlivestream.py
--
name aani_vimma_4M.ts
--
source
d:
\
SuomiTV
\
aani_ja_vimma_190410_4M.ts
--
fileloop

true
--
destdir D:
\
Temp
--
bitrate 536832
--
port 6879


The Windows task scheduler was used for scheduling time-based linear TV events. The createlivestream.py code was not ideally suited for linear TV program ingestion, as start-up time is critical for a linear TV program; it also could not be stopped by the task scheduler and had to be stopped manually.

VOD file seeding

A command line example for creating a torrent in a shell file is as follows:

#!/bin/bash

PYTHONPATH="$PYTHONPATH":/home/peng/Next-Share/:.
export PYTHONPATH

python BaseLib/Tools/createtorrent.py --source /home/peng/Videos/suomitv/KOHTI_UUTISIA.mov --tracker http://130.188.225.88:6969/announce --duration 00:08:51


A command line example for seeding a torrent in a shell file is as follows:

#!/bin/bash

PYTHONPATH="$PYTHONPATH":/home/peng/Next-Share/:.
export PYTHONPATH

python BaseLib/Tools/dirtrackerseeder.py -p 16881 /home/peng/Videos/suomitv/


Cache: Oversi servers

The following picture displays the torrents running on one Oversi cache server:


3.1.7 The User Interface

3.1.7.1 Overview

The P2P-Next portal user interface is an Ajax implementation which fetches, parses and displays the content of an Atom feed. It is implemented with plain JavaScript, CSS and HTML, and it uses the SwarmPlugin to play embedded torrent files. The target was to support all major browsers, but eventually we had to exclude Microsoft Internet Explorer from our trial (lack of usable DOM methods when scripting the <object> element). The following figure illustrates one example where the ATOM feed comes from a BBC server.




[Figure 9 depicts the portal architecture: a PC browser displays the P2P-Next portal page (HTML, CSS, JavaScript) and retrieves an XML ATOM feed over the Internet, either from the VTT server or from the BBC server (HTML + CSS + JavaScript + XML ATOM feed).]

Figure 9: The P2P-Next Portal Overview

3.1.7.2 ATOM feed

The P2P-Next portal user interface displays a set of categories, channel and programme information based on the ATOM feed. The ATOM feed contains three levels of information: 1) channels, 2) programmes, and 3) programme metadata.



[Figure 10 depicts the three feed levels: a Channels level (channel1, channel2, channel3), a Programme information level (programme1 to programme4), and a Metadata level (metadata for programme1).]

Figure 10: The three levels of ATOM feed






The ATOM feed used in our trial is a subset of BBC
’s P2P
-
Next ATOM feed. Some elements we left
out, because they were not needed. Here is a list of all used elements:

1
st

phase:

Channel level (4 elements: title, link
-
>href, category
-
>term, p2pnext:image
-
>src)

<entry>



<title>Aani ja Vimma 512Kbps</title>



<link type="application/atom+xml"
href="http://visulab.../feed/aanivimma
-
512k.xml" />



<category term="tv" scheme="urn:service
-
type" />



<p2pnext:image
src="http://visulab.../images/aani_ja_vimma_perus.png" />

</entry>



2
nd

phase:

Programme informati
on level (1 element: link
-
>href)

<entry>



<link type="application/xml"
href="http://visulab.../feed/aani_ja_vimma_512k.xml" />

</entry>

3
rd

phase:

Mpeg
-
7 metadata level (7 elements: Title, TitleImage, FreeTextAnnotation, MediaLocator,
MediaDuration, Avail
abilityPeriod
-
>type, TimePoint)

<Title type="main" xml:lang="fi">Ääni ja Vimma 2010</Title>

<TitleImage><MediaUri>http://visulab.../images/aani_ja_vimma_broadca
st.jpg</MediaUri></TitleImage>

<FreeTextAnnotation>Ääni ja Vimma 2010
-
bändikatselmus
kulttuuria
reena Gloriassa.</FreeTextAnnotation>

<MediaLocator><MediaUri>http://130.188.../aani_vimma_512K.ts.tstream
</MediaUri></MediaLocator>

<MediaDuration>PT3H15M0S</MediaDuration>

<AvailabilityPeriod

type="live">



<TimePoint>2010
-
04
-
20T10:00:00+03:00</TimePoint
>

</AvailabilityPeriod>
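
For illustration, the three-level traversal described above can be sketched in a few lines of Python using only the standard library. This is not the portal implementation (which is JavaScript); the feed URL is a placeholder, since the real trial URLs are elided in this document, and the MPEG-7 elements are matched by local name because the exact namespace is not reproduced here.

# Minimal sketch of walking the three feed levels with the Python standard
# library. The channel feed URL is a placeholder, not a real trial URL.
import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"
CHANNEL_FEED_URL = "http://example.invalid/feed/channels.xml"   # hypothetical

def fetch_xml(url):
    with urllib.request.urlopen(url) as response:
        return ET.fromstring(response.read())

# Level 1: each channel entry carries a title and a link to its programme feed.
channels = fetch_xml(CHANNEL_FEED_URL)
for entry in channels.findall(ATOM + "entry"):
    print("channel:", entry.findtext(ATOM + "title"))
    programme_feed_url = entry.find(ATOM + "link").get("href")

    # Level 2: each programme entry links to its MPEG-7 metadata document.
    for programme in fetch_xml(programme_feed_url).findall(ATOM + "entry"):
        metadata_url = programme.find(ATOM + "link").get("href")

        # Level 3: pull the .tstream media locator out of the metadata,
        # matching on local element names since the namespace may differ.
        for element in fetch_xml(metadata_url).iter():
            if element.tag.endswith("MediaUri") and element.text \
                    and element.text.endswith(".tstream"):
                print("  stream locator:", element.text)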



3.1.7.3 Displaying the Interactive Program Guide (IPG)

The JavaScript code interactively displays programme information based on user selections. All channel and programme information is stored in JavaScript arrays, and its processing does not involve any HTTP calls; only picture data is fetched on-line, and the browser cache stores the pictures to speed up the browsing experience.




Figure 11: P2P-Next portal displays programme guide for live television channels

We used three simulated live broadcast channels for the "TV" category: basic quality 512 kbps (PERUSLAATU), good quality 1.4 Mbps (HYVÄLAATU) and best quality 4 Mbps (PARASLAATU).


In our trial we used video content from the "Sound and Fury" ("Ääni ja Vimma") band review. Daily broadcasts were created from the previous evening's concerts, and on-demand material was clipped and encoded as soon as possible.



Figure 12: P2P-Next portal displays programme guide for on-demand programmes






Figure 13: P2P-Next portal displays results for search operation

3.1.7.4 The next step, Portal 2010

The Ajax approach used here is valid when users only want to view programmes. When more functionality is requested from the portal, such as user accounts and publishing of users' own videos in addition to professional content providers, then a real content management system must be used. We decided to use the Drupal CMS framework to create a new portal, where users are able to view public content without registering and can also register a new account to create and administer their own content. This work started during spring 2010 and continues until the end of the year.



Figure 14: P2P-Next portal 2010 and Drupal CMS


3.1.8 Content

During the test there were two types of streaming content available: the television channel side, which streamed the most recently acquired material, and the on-demand side, which had all the material archived (except for the newest material). The TV-channel side acted like live television, so when the viewer opened the stream it started showing the ongoing program from the point at which it was currently being broadcast. On-demand streams always started from the beginning of the program.


TV material was shot during the "Ääni ja Vimma 2010" festival, a band contest for 15- to 25-year-old young musicians held in April 2010 in the culture arena Gloria in Helsinki. In total, 15 band performances were shot, lasting about 15 minutes each.

Other TV content used in the test was the SuomiTV channel (a new TV channel focusing on family content), StadiTV (a local TV channel in Helsinki focusing on culture and events) and FabChannel (live music).

3.1.8.1 Service functions

The P2P-Next portal included the following functions:

- Television
- On-demand
- Search
- Feedback
- Quality settings (for television broadcasts, not on-demand)
- Program guide

Opening the portal brings up the television program guide displaying the currently broadcast program and a program guide displaying the programs for the morning, day and evening of the current day.

The Television function was made similar to viewing a traditional TV broadcast. The user selects the quality in which (s)he wants to view the broadcast (top right corner: basic quality, good quality and best quality) and then "watch", after which a separate video window opens and starts streaming the video. At the beginning of streaming, the SwarmPlugin add-on which handles the peer-to-peer traffic is also started, showing up in the Windows taskbar.



Figure 15: Opening page of P2P-Next portal.





Figure 16: Play window with “close window” and “pause” buttons. Placing mouse cursor over the SwarmPlugin icon on the lower right displays download, upload and sharing (helping) information.




Figure 17: On-Demand videos. Pointing the video's name with the mouse brings up an info-screen.

3.1.8.2 Use requirements

P2P-Next required the installation of the SwarmPlugin add-on, which handled the peer-to-peer connections transferring the video streams. The SwarmPlugin only works on Windows-based machines running the Firefox or Chrome browsers. A basic PC was sufficient to view the videos.


3.1.8.3 Use deployment

To view the P2P-Next videos, the users first had to register on the P2P-Next Owela site. After registration, instructions on how to install the plugin, as well as a user name and password, were presented together with the link to the service.

Figure 18: NextShare TV page in the Owela forum containing the instructions on how to get the SwarmPlugin installed and the link to the video portal.

After downloading the plugin, the users had to install it and restart their browser. No further configuration was necessary to view the videos (unless there was an active firewall which denied the plugin access to the Internet).


3.1.8.4 The service performance

At the beginning of the test period, one user reported that he had crashed his computer (Firefox) by downloading the SwarmPlugin. The same user also had problems with Chrome but finally got them working. Some users reported similar problems, but usually restarting the computer solved them (unless it was a network problem).


3.1.8.5 Content Ingest

Through this trial, we concluded that the P2P-Next platform can provide a reliable mechanism for the ingestion of time-based TV programs as well as VOD content via the Internet. Peer players can constantly access streamed torrent data. The P2P-Next platform is suitable for large content providers.

During this trial, we also gathered the following results:


3.1.8.6 Copyright issues

Content providers worried and wondered what would happen to their content once it was ingested to the open Internet. Music videos had severe copyright problems: in our case, we had to ask permission from all the bands and their members.

3.1.8.7 Content preparation

The time spent on FTP file transfer, copying, encoding and transcoding was enormous. In particular, during encoding, two computers (one a network computer) were encoding throughout the nights for the "Ääni ja Vimma" program, which lasted six days.

3.1.8.8 Actual storage

Large disk space was required for storing the original content as well as the re-encoded and transcoded files. The seeder servers delivered torrents from their hard disks and demanded more hard disk capacity, as all program events were scheduled via the task scheduler and the files were prepared for 12 hours a day. Organising files in an efficient way, the optimal locality of the content storage systems and the creation of torrents were important issues.

3.1.8.9 Piece size calculation

A small piece size (32 KB) was used for seeding the time-based linear TV program. A bigger piece size (128 KB) was used for the VOD torrents. The following table shows the sizes of VOD torrents with different piece sizes for a video file of 6.53 GB and a duration of three hours and 33 minutes.
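
The table itself is not reproduced here, but the trade-off it illustrates can be computed directly: a torrent file stores a 20-byte SHA-1 hash per piece, so halving the piece size roughly doubles both the number of pieces and the size of the piece-hash block. The following is an illustrative calculation only, using the file size quoted above.

# Illustrative calculation: number of pieces and approximate size of the
# piece-hash block (20 bytes of SHA-1 per piece) for the 6.53 GB trial file.
import math

FILE_SIZE_BYTES = int(6.53 * 1024 ** 3)     # 6.53 GiB video file
PIECE_SIZES = [32 * 1024, 128 * 1024]       # 32 KB (live) and 128 KB (VOD)

for piece_size in PIECE_SIZES:
    pieces = math.ceil(FILE_SIZE_BYTES / piece_size)
    hash_block_kib = pieces * 20 / 1024     # 20-byte hash per piece
    print("piece size %3d KB: %d pieces, ~%.0f KiB of piece hashes"
          % (piece_size // 1024, pieces, hash_block_kib))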

3.1.9 Living lab trial site

3.1.9.1 Owela

The test users were recruited mainly through VTT's open web lab, Owela. Owela is an online laboratory that utilises social media features for participatory design and open innovation. Tests in Owela can be run either in public or as closed tests with invited users.


We chose to invite users because we wanted detailed demographics, which we could then trace back to the individualized log data statistics we gathered. Invitations to participate in the trial were first sent to users with different profiles (age, gender, occupation, etc.) from the Owela user database.


We also sent invitations to two user groups that might be interested in participating in the trial for different reasons. We assumed that the young musicians in the band contest "Ääni ja vimma", together with their friends and families, would be interested in watching, commenting on and downloading the videos, as they were not available anywhere else. However, in the early stages of the trial we found that another portal was offering the same content without registration or a plugin download, so we were not able to attract many participants from that group.

The other group of users we invited came from an online community called Muropaketti. This group consists mainly of open source activists whose technical expertise is high.

Altogether 91 participants used the P2P service during the test period. In the end, 32 of the participants filled in a feedback questionnaire and 10 persons participated in the Owela discussion groups.


3.1.9.2 Methods for evaluating user feedback

We collected and evaluated user feedback and the quality of the user experience with both qualitative and quantitative methods. First, in the NextShare adoption phase, we gave the users the opportunity to send immediate feedback and discuss problems in our open web lab, Owela.



When the users tried the service for the first time, they were asked to fill in a questionnaire in which they could express their first impressions of the service and evaluate their user experience. They were asked to rate, for instance, how easy it was to adopt the service, how they experienced the video quality, how logical the user interface was, how much errors and disturbances during viewing affected the experience, and how interesting they found the service and the content in general.


These ratings were given on a scale from 1 to 10. The users were also asked to answer some open questions, such as how useful they found the service and whether they could see themselves using this kind of service in the future.


A log data collection system was also designed to follow the use of the service. The appropriate design of log functions for the evaluations is important and should be done in parallel with the technical development, because it may not be easy to add logging functionality to a ready-made system. Consideration should be given to what data is needed and how the data can be automatically collected and converted into a form that is easy to analyze during the evaluation (Kaasinen 2005, p. 150).


We collected log data to reveal the time and duration of actual occurrences of P2P service use. Log data gives specific answers to questions such as: How much did the user use the service on different days of the testing period? At what times of the day did he or she use the service? On which weekdays was the service used? How did the use change over the test period? What kind of content was viewed (channels, categories, media types, etc.)? How long was the content viewed? The log data on service use was collected from the P2P service prototype and then analyzed statistically.


3.1.10 User feedback


First impressions

Users thought that it was relatively easy to start using the service (Figure 19); they rated deployment 6.6 on a scale from 1 to 10. According to the discussion area in Owela, however, there were also some problems with deployment (see below). For that reason, the overall conclusion is that deployment should be made easier in the next versions.


The service's ease of use was rated 5.9, and the clarity of the user interface was about the same. The service had some problems during use, and malfunctioning was therefore rated only 4.9; the service's interaction functions were rated only 4.3. Overall, the service's usability factors could be better.


Video quality was rated 5.2; for a future entertainment service the video quality must be better. The users were not satisfied with the content, which was rated 4.8, while the informativeness of the service was rated 5.1. The overall attractiveness of the service was rated 6.4, which is a little better.


Deployment                    6.59
Video quality                 5.19
User interface clarity        5.88
Ease of use                   5.88
User interface logicality     5.56
Service malfunctionality      4.94
Service informativeness       5.09
Service multifunctionality    4.16
Service attractiveness        6.41
Content                       4.76
Service interaction           4.31

Figure 19: Different aspects of user experience (mean ratings, scale 1-10).

Usefulness

According to the users, the service was useful: 75% of the respondents rated its usefulness between 7 and 10 on a scale from 1 to 10, and 15.7% rated it between 2 and 6. 32 users answered the questionnaire, 31 of them men.


Deployment

“Taking into use was easy at least for me when you know what to do and where. Maybe for the ones that don't use a computer so often, for example, the plug-in installation can cause trouble.”

“The plugin wouldn't work at start but after a couple of reboots and re-installations etc.”

“I managed to install the required software easily and everything worked fast and fine. I managed to avoid the problems in the portal according to other users' comments (I knew where to look for and what to click). Still I could not view any videos because I was missing a plugin. I tried to find one but Firefox didn't find any.”

“Taking into use was OK when Chrome was already there and loading the plugin was easy. There could've been some information from the plugin itself, what is it for etc.”


Video quality

“The best quality is a joke at the moment, you can see unpleasant bars that are noticeable on any display.”

“Quite raw still. With the content available (Ääni ja vimma) I'm not going to open the service a second time... Lowest quality was really bad, good OK but still stuttered with a 12Mb connection, best quality didn't basically even work at all.”


Content

“At least at this moment the content is quite bad but I believe that the service (this kind of service) will succeed in the future.”

“If the service was made more diverse it could make it much more appealing. At this moment the content was so narrow that I didn't mind exploring it much. But with good content I could really get interested.”

“Just a few lousy videos - how can anyone talk about diversity here? There was nothing more interesting for the users here compared to other video services.”



Service usability

“It was a bit unclear how to get a video playing in the beginning”

“Badly. Installation as such went fine without problems but getting a video to show up took far too long. Watching was already interrupted several times during the first minutes due to hang-ups and other stutters. DEINTERLACING IS MISSING! A true amateur mistake in the broadcast...”


Usefulness of the service

“What is there more advanced in this service than in those commercial Internet channels? The content of course determines if the service is interesting or not. But still it's good that these kinds of things are done. It's also interesting how this kind of service would scale up for larger crowds.”


Overall evaluation

Recommendations for the future

“Everything in the same place, quick search, diverse, support for local download, simple.”

“Multi-platform! In addition to computers, should be viewable on various IPTV set-top-box solutions.”

“I'd hope for administered entities, clear menu structures, diverse content and easiness of taking into use. A service that both young and old feel comfortable to use.”


3.1.10.1 Use statistics

Figure 20: The number of users per day (19.4.2010 - 13.5.2010); daily counts ranged from 0 to a peak of 25 users.





Figure 21: Total watching time per day (19.4.2010 - 13.5.2010); daily totals peaked at about 158 minutes.


Weekday    Avg. users    Avg. minutes
Mon            0.8             3.4
Tue            1.3            19.1
Wed           10.5            71.4
Thu           10.3             0.7
Fri            4.0             8.9
Sat            5.0            30.7
Sun            2.3             9.4

Figure 22: The average number of users and viewing minutes per weekday.





Figure 23: Number of users per day (same data as Figure 20).

Figure 24: Total minutes viewed per day (same data as Figure 21).

Figure 25: Average number of users and use minutes per weekday (same data as Figure 22).


3.1.10.2 Statistics Collection System

The status reporting of the SwarmPlugin peers used in the VTT test was modified to send event data to a VTT server, where the data was gathered into a MySQL database. At 30-minute intervals, visualisation software fetched the event data from the last 10 days and processed statistics and plots to be displayed on a web page: http://p2pnext-statistics.willab.fi/p2pstat/.


The visualisation software makes two SQL queries to the database. One returns the maximum and minimum timestamps for each peer id (different from 'on') found in the last 10 days. The other fetches the events 'NrOfHelpingPeers', 'DlRate' and 'UlRate' from the same time interval.
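The schema of the event database is not described in this deliverable, so the following sketch merely illustrates the shape of these two queries against a hypothetical table events(peer_id, event_name, event_value, ts); the actual VTT implementation may differ.

```python
# Sketch of the two queries described above, written against an assumed
# schema events(peer_id, event_name, event_value, ts). Any DB-API 2.0
# connection to the MySQL database could be passed to fetch_statistics().

PEER_LIFETIMES_SQL = """
    SELECT peer_id, MIN(ts) AS first_seen, MAX(ts) AS last_seen
    FROM events
    WHERE ts >= NOW() - INTERVAL 10 DAY
    GROUP BY peer_id
"""

RATE_EVENTS_SQL = """
    SELECT peer_id, event_name, event_value, ts
    FROM events
    WHERE ts >= NOW() - INTERVAL 10 DAY
      AND event_name IN ('NrOfHelpingPeers', 'DlRate', 'UlRate')
"""

def fetch_statistics(conn):
    """Return (peer lifetimes, rate/helping-peer events) for the last 10 days."""
    cur = conn.cursor()
    cur.execute(PEER_LIFETIMES_SQL)
    lifetimes = cur.fetchall()      # rows of (peer_id, first_seen, last_seen)
    cur.execute(RATE_EVENTS_SQL)
    events = cur.fetchall()         # rows of (peer_id, event_name, value, ts)
    cur.close()
    return lifetimes, events
```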



The software makes the following plots from the data:





• Peer arrivals per day over the last ten days and per hour over the last 24 hours. These show the arrivals of SwarmPlugin peers into the system. Arrivals are counted from the number of minimum peer id timestamps occurring on that day or in that hour (these should be timestamps of 'peerid' events). The plots show all arrivals in the time interval, including restarts of peers that have already been online (these are not easily distinguished, since a peer always gets a new unique peer id at startup).




• Number of SwarmPlugin peers in the system over the last ten days and over the last 24 hours. These plots are shown at timestamp accuracy (per second). The minimum and maximum timestamps for each peer id determine the time interval during which it was online; the number of peers at each second is then determined by overlapping these time intervals (see the sketch after this list).




• Average number of helping peers over the last ten days, calculated per hour as the average of the 'NrOfHelpingPeers' event values received during that hour.




• Average download and upload rates over the last ten days, calculated per hour as the average of the 'DlRate' and 'UlRate' event values received during that hour.
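As an illustration of the overlap counting and the per-hour averaging described in the list above, a minimal sketch is given below; the function and field names are assumptions, not the actual visualisation code.

```python
from collections import defaultdict

def peers_online(lifetimes):
    """lifetimes: iterable of (peer_id, first_seen, last_seen) datetime rows.
    Count, per second, how many peers' online intervals overlap that second.
    A per-second loop is acceptable here because the observed online times
    were short (minutes rather than days)."""
    counts = defaultdict(int)
    for _peer_id, first_seen, last_seen in lifetimes:
        start, end = int(first_seen.timestamp()), int(last_seen.timestamp())
        for second in range(start, end + 1):
            counts[second] += 1
    return counts                   # {unix second: number of peers online}

def hourly_average(events, name):
    """events: iterable of (peer_id, event_name, value, ts) rows.
    Average the values of the named event ('DlRate', 'UlRate' or
    'NrOfHelpingPeers') per hour."""
    buckets = defaultdict(list)
    for _peer_id, event_name, value, ts in events:
        if event_name == name:
            hour = ts.replace(minute=0, second=0, microsecond=0)
            buckets[hour].append(float(value))
    return {hour: sum(vals) / len(vals) for hour, vals in buckets.items()}
```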




Additionally, an IP-address specific display of download and upload rates was tested. If the IP address of the web page visitor matched one found in the database, the average 'DlRate' and 'UlRate' event values for that IP address over the last 10 days were also shown as plots. The identification was done by comparing the peer id values of 'peerip' events with those of the 'DlRate' and 'UlRate' events.



However, it was found that the IP address sent by a peer in its 'peerip' events was not always the one it actually used in the swarm: Python on Windows seems to fetch the address of the first network interface found in the system, which is not necessarily the right one.
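A common workaround, shown here only as an illustration (it was not part of the trial software, and it still does not help behind NAT, as discussed below), is to let the operating system pick the outbound interface by "connecting" a UDP socket to an external address and reading back the locally bound address:

```python
import socket

def local_outbound_ip(probe_host="8.8.8.8", probe_port=53):
    """Return the local address the OS would use to reach probe_host.
    connect() on a UDP socket only selects the route and binds the socket;
    no packets are actually sent."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect((probe_host, probe_port))
        return s.getsockname()[0]
    finally:
        s.close()
```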



A more obvious difficulty is a peer behind NAT, which naturally sends its private address to the server rather than the global address used by the NAT gateway.



Also, a user who gets a global IP address from a DHCP server does not necessarily still have the same address that was used in the test.



Because of these difficulties, the IP-specific display was discarded.

3.1.10.3 Statistics from 15 April to 15 May

The following plots were drawn by the visualisation software described above. However, instead of ten days, they show the span of the most active testing period, from 15 April to 15 May:



Figure 26: P2P-Next statistics web page









Figure 27: Arrivals of SwarmPlugin peers per day

All SwarmPlugin peer arrivals into the system during the time span: 380



Figure 28: SwarmPlugin peers in the system

Average number of SwarmPlugin peers in the system over the time span: 0.057

Average time of a SwarmPlugin peer online: 6 min 29 s

Maximum time of a SwarmPlugin peer online: 3 h 8 min 36 s
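These figures are mutually consistent: by Little's law, the average number of peers in the system equals the arrival rate multiplied by the average time online. A quick check, assuming a 30-day span (15 April to 15 May), reproduces the reported value:

```python
# Consistency check (Little's law: L = lambda * W) using the figures above.
arrivals = 380                      # SwarmPlugin peer arrivals over the span
avg_online_s = 6 * 60 + 29          # average time online: 6 min 29 s
span_s = 30 * 24 * 3600             # 15 April - 15 May, 30 days

avg_peers = arrivals * avg_online_s / span_s
print(f"average peers in system ~ {avg_peers:.3f}")   # ~0.057, as reported
```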








Figure 29: Average number of helping peers

Maximum number of helping peers in a 'NrOfHelpingPeers' message: 7

(The number of helping peers also includes non-SwarmPlugin peers that participate in the swarm but do not report event data; e.g. the caches and the seeder were normal BitTorrent clients.)


Figure 30: Average download rates

Figure 31: Average upload rates

Average of the values of all 'UlRate' messages received: 5.0 kBytes/s (40 kbit/s)

Maximum upload rate in a 'UlRate' message: 467.0 kBytes/s (3.7 Mbit/s)





3.1.11 Video Quality Trial

The effect of decreasing network quality on NextShare live streaming video quality was measured in order to determine the network parameter limits for tolerable video quality. Network quality was decreased in the network emulator (Netem) by altering one parameter (bandwidth / delay / jitter) until the