Presentation
"... the only way to deal with law in cyberspace is to ignore it, wildly, flagrantly...."
John Perry Barlow, 2001, P2P Conference
"These people are lunatics, who would destroy the future of free expression and technological development, so they could sit in easy chairs at the top of the smoking ruins and light their cigars off 'em." - John Gilmore, Thu, 21 Dec 2000
a. Break down the rights of the exclusive grant under the 1976 Act.
Retaining performance, display and synchronization rights.
b. Background of Hardin and Olson.
Points of departure
The Logic of Collective Action, Harvard, 1965
The Tragedy of the Commons, Science, 1968
Olson analyses the difficulties of collective organisation in the context of attempts to acquire exclusive/inclusive benefits. He concludes that it is easier to appropriate exclusive benefits. The difficulty in acquiring optimal amounts of inclusive goods (those of a public-good type) derives from free-riding.
The seminal description of the public goods problem is Mancur Olson’s “The Logic of Collective Action: Public Goods and the Theory of Groups”, Harvard University Press, 1971, at pages 14-15: "A common, collective, or public good is here defined as any good such that, if person X in a group x…..y….z consumes it, it cannot feasibly be withheld from others in that group. In other words, those who do not purchase or pay for any of the public or collective good cannot be excluded or kept from sharing in the consumption of the good, as they can where noncollective goods are concerned." National defence is often offered as an example. In economic terms this is known as non-excludability.
See Olson, op. cit. at p.21: " Though all the members of the group therefore have a common interest in obtaining this collective benefit, they have no common interest in paying the cost of providing that collective good. Each would prefer that the others pay the entire cost, and ordinarily would get any benefit provided whether he had borne part of the cost or not." According to Olson, this implies that there will be sub-optimal production of collective goods to a degree determined by the size of the group: the larger the group, the greater the shortfall.
Hardin examines world population growth through the optic of sustainability:
“The tragedy of the commons develops in this way. Picture a pasture open to all. It is to be expected that each herdsman will try to keep as many cattle as possible on the commons. Such an arrangement may work reasonably satisfactorily for centuries because tribal wars, poaching, and disease keep the numbers of both man and beast well below the carrying capacity of the land. Finally, however, comes the day of reckoning, that is, the day when the long-desired goal of social stability becomes a reality. At this point, the inherent logic of the commons remorselessly generates tragedy.
As a rational being, each herdsman seeks to maximize his gain. Explicitly or implicitly, more or less consciously, he asks, "What is the utility to me of adding one more animal to my herd?" This utility has one negative and one positive component.
1. The positive component is a function of the increment of one animal. Since the herdsman receives all the proceeds from the sale of the additional animal, the positive utility is nearly + 1.
2. The negative component is a function of the additional overgrazing created by one more animal. Since, however, the effects of overgrazing are shared by all the herdsmen, the negative utility for any particular decision making herdsman is only a fraction of - 1.
Adding together the component partial utilities, the rational herdsman concludes that the only sensible course for him to pursue is to add another animal to his herd. And another.... But this is the conclusion reached by each and every rational herdsman sharing a commons. Therein is the tragedy. Each man is locked into a system that compels him to increase his herd without limit -- in a world that is limited. Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons. Freedom in a commons brings ruin to all.”
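Hardin's two partial utilities can be made concrete in a minimal sketch; the herd sizes, animal price and carrying capacity below are illustrative assumptions, not figures from the text:

```python
# A minimal sketch of the herdsman's calculation. Herd sizes, the price of an
# animal and the carrying capacity are illustrative assumptions.
def herdsman_payoff(my_cattle, total_cattle, carrying_capacity=100, price=1.0):
    """Private proceeds from my herd, minus my share of the overgrazing cost."""
    overgrazing = max(0, total_cattle - carrying_capacity)
    shared_cost = overgrazing * price  # damage borne by the commons as a whole
    my_share = shared_cost * (my_cattle / total_cattle)  # only a fraction falls on me
    return my_cattle * price - my_share

# Ten herdsmen, commons exactly at capacity (10 cattle each).
before = herdsman_payoff(10, 100)
gain = herdsman_payoff(11, 101) - before  # my marginal utility of one more animal
print(gain > 0)  # True: positive for me, though the commons as a whole is worse off
```

The marginal gain stays positive because the proceeds of the extra animal accrue entirely to one herdsman while the overgrazing cost is spread across all of them, which is exactly the asymmetry Hardin describes.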
The first riposte to Hardin’s thesis as applied to software (which is how I will refer to entertainment/informational goods for the remainder of the presentation) is that, because such goods have the character of public goods, the premise that overconsumption leads to exhaustion is negated.
See Rose: individual property and commons are axes that predetermine our response to resource-allocation problems. She redefines the commons as open access, as opposed to collective property or common-property resources. History. Narrative.
c. Breyer/Tyerman debate.
Early History of Scepticism
Breyer: 1970
“ …. Copyright law… may represent one way of resolving the conflict between the need for book revenues high enough to secure adequate production and book prices low enough not to interfere with widespread dissemination of what is written… A copyright system however, is not the only way to resolve this conflict. It would be possible for instance, to do without copyright, relying upon authors, publishers, and buyers to work out arrangements among themselves that would provide books’ creators with enough money to produce them.”
lead-time
the threat of retaliation: fighting edition
rationality in producing pirated versions of low cost editions
reputation
quality
extant distribution channels
Solutions: Alternative Revenue Streams
“Buyers, individually or in groups, might contract to buy the book in advance of publication, before copying is possible.”
Problems:
Administrative costs.
Delegation of choice to the tyrants of taste
Blind man dilemma of buyer who is financing a book whose contents are unknown.
Strategic behaviour on the part of members of the group in undervaluing the work; this will pose a problem specifically where there is a shortfall in the embargoed amount.
Benefits of Abolition
Lower prices
Removal of barriers to entry
Hazards/Costs of Abolition
Threat to high cost, large volume book.
Demoralisation costs of institutional change are difficult to assess.
Policy Recommendations
Duration is too long and should not be extended
Private copying should be explicitly permitted
Computer storage of works should be allowed without copyright owner’s permission.
No © for computer programs.
Focus
Speed of competitive response and its relationship with price
Buyer self-organisation
Alternative subsidies
Need to tailor analysis when focusing on different types of writings
Need for more empirical info prior to revision
Tyerman
Breyer’s article was cited favourably by Douglas J in his 1971 dissent in Lee v. Runge, 92 S. Ct.
Cf. Breyer’s claims that tort law could look after the rights of paternity, integrity and first publication.
1. Tyerman argues that in the absence of legal protection, marginal works of doubtful profitability will not be published due to fears of being undercut in the market by a competitor.
2. The ‘lead time myth’ (p. 1108) on royalties to British authors.
Larger sunk costs require a greater period of time to recoup; this is not like the small economies of the 19th century. Breyer reckoned on the initial publisher having a 6-8 week advantage; Tyerman contests this, arguing that technological change makes the creation of such imitations easier and swifter. Probably true; DTP, digital e-books etc. make it even more so.
ii. Breyer claims that people are price insensitive when it comes to book purchasing and will refuse to wait until a cheap edition comes out.
iii. Says distributors, who determine the structure of the retail market, would be happy to wait a couple of days given the higher margin they could receive from pirate editions.
Says that what lead time advantage there is doesn’t work in markets where the time required to recoup fixed costs is longer, e.g. text books, less popular trade paperbacks.
3. The fear of retaliation
Retaliatory fighting editions were issued only for uncopyrighted British works; it was an instrument to be used occasionally. In Breyer’s world, this fear would have to play a disciplinary role every time. Used repetitively, it would only be a matter of time before the publisher became financially insolvent.
4. Contradicts Breyer’s use of government reprints and public domain titles to claim that publishers and copiers of trade paperbacks can co-exist. Says that in both cases there is no author to pay, and no advantage in terms of first-copy costs for the ‘copier’.
Casts doubt on the profitability of 19th-century works by British authors; says maybe it was just a marketing draw, capitalised upon through the sale of copyrighted works by US authors.
5. Transaction costs of group organisation – eroded by technology
Delegation of purchasing authority and concomitant blurring of individual preferences.
The unseen work problem.
Freeloaders
6. State financing alternative is rejected because it could entail utilitarian selection or censorship.
7. Criticises Breyer’s neglect of authors; trots out Marci's claim about financial independence, and Neil's about countering what would otherwise be an elitist culture. Copyright allows the spreading of risk.
8. Says book prices would rise if dependent on lead time, and that the smaller volume produced would lead to higher prices as the costs can’t be spread over as many copies. Says the high price of trade paperbacks and textbooks is because they are cross-subsidising agents.
9. Says that permission acquisition will require more stringent rules as it plays a larger role in income stream.
10. Identifies a clash in interests between publisher and author: one wants to maximise sales and the other profits. These things do not necessarily converge. Think advertising costs.
11. What about revenue from subsidiary rights?
12. Rejects Breyer’s contestability argument and states that loss of © protection will disproportionately impact upon smaller houses with shallower pockets.
Breyer Rejoinder
1. There are book clubs and there are institutional instances of mass purchase.
2. There are plenty of subsidies already.
3. Critical of William & Wilkins decision, demands exemption. Cull the transaction costs.
4. The textbook market is concentrated; the trade market is not. High profits indicate the presence of market power, no pricing discipline.
5. Collective licensing organisations spend loads on collecting fees (ASCAP 1971 spent $9m to collect $49m)
On digital storage (p. 81):
“As long as economies of scale are such that only one information system would serve an area, the need to contract with the author for the storage of his work should prove to be a sufficient safeguard for his securing adequate compensation.”
Predicts difficulties of co-existence of data and hard copy world.
Reiterates that it’s about how much protection is required.
What forces sustain production in the absence of ©?
What other forces inhibit competition making recoupment impossible?
What does the government pay?
Are buyers likely to find other channels to direct the funds to producers?
Does © drive up prices? Does this diminish circulation?
Do transaction costs for permissions impede circulation?
Can © be used to inhibit competition within the industry?
d. William Kingston: Direct Protection of Innovation.
William Kingston's work centers on direct support for innovation. His primary field of concern is the patent system, though he contends that his central arguments apply equally to copyright law.
Kingston identifies three areas for reform in existing law:
1. The enforcement process is long, unwieldy and expensive, and intrinsically disadvantages smaller firms in conflicts with deep-pocketed rivals. Even where litigation promises a good chance of success, it is to be eschewed where possible, as it necessarily diverts the principals (who in small companies are often the carriers of important informational capital) from innovation to the courtroom, a move which will disproportionately impact their wealth in comparison with a corporate bigfoot which retains fleets of in-house counsel and litigators.
- The parallels with copyright are not difficult to observe, particularly in court cases that match the owners of copyright inventories against individuals and small entertainment/cultural/academic outfits. Think of producers of derivative works as the incremental improvers or combination patentees of the © world.
2. Kingston's second claim is that it is insanity to maintain a system of monopoly defined by length of time rather than money, as this serves to grant (if one accepts the incentive theory of creation/innovation, at least) an exclusivity which is inefficient because it attempts to make one size fit all. Levels of risk fluctuate wildly from industry to industry, and more pertinently industries have different relationships between inputs and outputs. Pharmaceuticals and aerospace are good examples: the former has high R&D/initial manufacture costs but low costs of reproduction; the latter has market entry costs (in terms of the investment required to set up production) so formidable as to deter would-be free riders.
- Likewise, different types of copyright works have vastly different cost structures. Movies and books have radically different complexities of production and risk exposure.
3. Diffusion: Kingston's last and most intriguing proposal is the replacement of the property-regime character of informational law with a system that combines compulsory licenses and capital payments to the innovator. The level of payment would be determined by the time of entry of the second party (a correlate of risk), based upon accepted pre-production accounting, so as to simulate the return that would be derived from a fortuitous outcome to the innovation and the exclusivity flowing therefrom. The shift to liability means that strategies predicated on extracting a rent-level price through hold-out will be defeated or mitigated (he gives the example of Motorola's flat refusal to license crucial elements of mobile phone technology to Ericsson; particularly relevant to complex technology industries where saturation patenting is common). For Kingston this is an especially important point, as he argues that the most valuable innovations are incremental, and companies compete to capture share in new markets and expand the scope of application of an invention.
Kingston doesn't like ex post valuations, which are susceptible to fraud, and wants an ex ante system of accounting. On the basis of these accounts, competitors would be allowed to buy in to share the information produced by an innovator in return for a retrospective sharing in the investment and the risk in that investment, i.e. there must be a return multiple to reward the innovator. This is a capital payment for a sunk cost; royalties would predicate income on ultimate success, which would not be fair.
Variables for determining capital sum:
a. Needs and risks associated with a given industry
b. Stage of development of the innovation, e.g. a multiple of 8(i) at prototype stage, 5(i) at product stage, etc.
c. To what degree should the portfolio aspect of R&D be allowed to enter into the capital determination is unclear.
d. What duration of protection should the originator and licensees receive?
e. Mitigates harmful delays in cumulative technology industries such as aerospace and semi-conductors where progress is impeded until formation of a patent pool, but also disciplines the anti-competitive effects of patent pools.
Problems with Kingston's proposal center on the temptation of participants to cheat in the accounting process, by hiding or inflating costs, if these are to be the variables which determine the buy-in price.
Secondly, Kingston's proposal implicitly institutionalizes a regime where the scale of R&D finance forms the basis of the reward. This will undervalue cheap but ingenious devices and innovations. This returns us to the question whether a period of exclusivity should be determined on the basis of time or money. If it is to be the latter, how shall the sum be calculated? Perhaps Kingston's plan could be partially synthesized with that of Shavell as outlined below, where the reward is indexed to the popularity of the product as measured in sales.
e. Steven Shavell/Tanguy Van Ypersele: Rewards versus intellectual property rights & the State.
Shavell and van Ypersele make a critical comparison between the incentive provided by IP rights and rewards paid directly to the creator/innovator by the state. The authors dwell on the difficulties of government setting of reward levels due to informational deficiencies, but suggest that the amount be calculated on the basis of the popularity or utility of the product as reflected in sales. The products themselves would pass immediately into the public domain. They conclude that an optional reward/IP regime would be superior to the current form of regulation.
The authors identify the waves of opposition to the patent monopoly that rolled over the 18th and 19th centuries and the institutional alternatives considered. They note that direct rewards were used, with eight acts awarding rewards for specific inventions passed by Parliament between 1750 and 1825 (including $30,000 for the smallpox vaccine, collected by Edward Jenner (p.2)), as well as numerous other rewards provided by industry groups and other organizations. The resonance with the Street Performer Protocol's proposal for escrow-based markets is striking.
Deviations from 'first-best'.
Without patents, dissemination occurs at marginal cost and there is no deadweight loss. The incentive to invest in research 'is imperfect under both systems, but in different ways' (p.6).
Essentially the authors argue that, depending on the size of the surplus created, rewards and patents may both over- or under-incentivise R&D investment. Neither form can overcome this informational deficiency. They do argue that an optional reward/patent system will always be superior to the patent system alone.
1. The race to innovate first, and the waste this generates, continues under either system.
2. Subsequent innovations are facilitated through the fact that the first inventor cannot block improvements through withholding the license.
2a. The value of subsequent incremental invention will complicate the government's job in pinning down its value.
3. It would have to be financed through taxation, probably on income rather than commodities, as this method minimizes the deadweight loss relative to the available alternatives. They remark that taxation on goods functions effectively in the same way as an IPR.
4. A shift to an optional reward/patent system should be frictionless as it can only increase the profits of the companies affected by it.
5. The authors are particularly attracted to the reward system where the social losses from the existing IPR regime are high (i.e. a large gap between price and production cost): pharmaceuticals, software, music, movies.
Rewards to be based on sales, perhaps reassessed annually.
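The comparison the authors draw can be illustrated with a standard linear-demand sketch; all parameter values (demand intercept, slope, marginal cost) are assumptions for the illustration, not figures from the paper:

```python
# Illustrative comparison of a patent monopoly with a sales-based reward
# under linear demand q = A - B*p. All parameter values are assumptions.
A, B = 100.0, 1.0   # demand intercept and slope
MC = 20.0           # marginal production cost

def demand(p):
    return max(0.0, A - B * p)

# Patent: the monopolist maximises (p - MC) * (A - B*p), giving p* = (A/B + MC) / 2.
p_monopoly = (A / B + MC) / 2
q_monopoly = demand(p_monopoly)
profit = (p_monopoly - MC) * q_monopoly        # the incentive the patent provides
q_comp = demand(MC)                            # output when priced at marginal cost
dwl = 0.5 * (p_monopoly - MC) * (q_comp - q_monopoly)  # deadweight-loss triangle

# Reward: price falls to MC (no deadweight loss); the state pays a per-unit
# reward out of taxation, here calibrated so the innovator earns the same total.
reward_per_unit = profit / q_comp
print(profit, dwl, reward_per_unit * q_comp)   # 1600.0 800.0 1600.0
```

The reward regime delivers the same producer incentive while eliminating the deadweight loss, which is why the authors favour it where the gap between price and production cost is large; the remaining problem, as the text notes, is the government's informational burden in calibrating the reward.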
The key weakness of the system is its dependence on government knowledge of the social value of innovations. The importance of this in setting rewards ex ante is clear, but if, as the authors propose, the government uses sales data to assess value, other problems arise. First, it would require a massive concentration of information in government hands. There may be no malign possibility in the context of the take-up of mechanisms/devices, but the danger of encroachment, and the temptation it would offer in the case of informational works, is clear. Second, it would encourage various forms of falsification of data by entities motivated to inflate their sales figures in order to increase their rewards.
f. Bruce Schneier & John Kelsey: The Street Performer Protocol. Escrowing our way out of the tragedy of the commons, without the need for a patron, or rather with a truly public patron - preference-mongers satisfied.
Current © schemes driven by the industry can be defined as
a) 'Secure perimeter' schemes (encrypted containers/trusted systems).
b) 'Traitor tracing' schemes (embedding purchaser information in the digital content, allowing identification and later prosecution of the person responsible for a leak).
Escrow-based markets. In the schema Schneier describes works are financed through a form of mass public patronage. The creator of a work establishes a bank account with a trusted third party. A threshold cash requirement is set. When the amount of money in the account equals or exceeds the threshold, the work is released to the public. Escrow holds the work hostage in a sense, and the artist is guaranteed a basic payment for her work.
Problems identified by the authors:
1. The author can charge an inappropriate price. He and other authors will presumably learn from their early mistakes, and become skilled at choosing appropriate prices.
2. The author publishes the book before he gets the requested amount donated. This doesn't appear to hurt anyone directly except the author, but it may undermine participation in this kind of scheme later, especially in schemes run by this author later.
3. The author gets his amount filled, but still doesn't publish the book. This will ruin his reputation for future deals of this kind, but that is only a concern if he has already built up a reputation, and if he intends to publish future books. It is here that we can see how to use cryptography and a trusted third party to make the whole system work.
First two issues are characterised as market problems that can remedy themselves over time. Third problem is one of trust, which is where the SPP is introduced.
Why donate?
- A donor may give money partly out of the desire to be recognized as a generous person or a patron of the arts.
- There may be additional premiums involved in donating; raffles for a lunch with the author, for example.
- A donor may be more likely to give money when he can see that it has an immediate effect. Thus, public radio stations have goals for pledge drives, and also for specific times. This might translate into letting novels appear in small fragments, as small additional goals are met. Experience in the market will determine what pricing and marketing strategies work best.
Process:
*Submission of Work to Publisher: The Author and Publisher negotiate terms, based on how much the next chapter (or several chapters) will cost to get released, and how the money collected will be split between the Author and the Publisher
*Gathering Donations: The Donor sends $N in donations, plus some unique identifier to specify where any refunds should go. Donors who wish to remain anonymous may identify either an anonymous account (where funds could eventually find their way back to the Donor), or some charity or other beneficiary of their choice.
*Paying Back the Donors: If the promised work is not released by the specified date, then the Donors' signed documents can be used to collect money from the Publisher.
" The Street Performer Protocol is effectively a means of collecting private funds for public works. It allows for all kinds of alternative public creative (literary, music, video) works. It can be used to improve public-domain software. Software developers could announce a price structure to add various features to an existing public-domain software package, and users could pay for the features they want. When sufficient interest generates the right amount of funds for a given feature, it is created. This Protocol could pay for Web sites; contributions mean that a given Web site will continue to be maintained and improved."
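Stripped of the cryptography, the threshold mechanism the authors describe reduces to a small amount of bookkeeping. The interface below is a hypothetical sketch; the actual protocol's signed pledges, release dates and publisher guarantees are omitted:

```python
# Sketch of the escrow threshold logic only (hypothetical interface). The
# paper's protocol adds signed pledges and a publisher who guarantees refunds.
class StreetPerformerEscrow:
    def __init__(self, threshold):
        self.threshold = threshold
        self.pledges = {}      # donor id -> amount, kept so refunds are possible
        self.released = False

    def donate(self, donor_id, amount):
        """Record a pledge; release the work once the threshold is met."""
        self.pledges[donor_id] = self.pledges.get(donor_id, 0) + amount
        if sum(self.pledges.values()) >= self.threshold:
            self.released = True   # the work is published for everyone
        return self.released

    def refund(self):
        """If the work was never released, return the pledges to the donors."""
        if self.released:
            return {}
        refunds, self.pledges = self.pledges, {}
        return refunds

escrow = StreetPerformerEscrow(threshold=100)
escrow.donate("alice", 60)       # below threshold: nothing released yet
print(escrow.donate("bob", 50))  # True: 110 >= 100, the work is released
```

Keeping per-donor pledges rather than a single running total is what makes the refund path possible, which is the trust problem the protocol exists to solve.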
This proposal has been the subject of much criticism, and the principal object of the paper is to tease out the problems identified. Theoretical objections are made, built upon Olson’s work and its derivatives, ‘The Tragedy of the Commons’ and ‘The Prisoner’s Dilemma’.
Experiments with escrow markets are ongoing in the open source field, notably http://www.cosource.com/ (344 requests with total interest of $196,974.00; 54 proposals with total commitments of $25,884.90; 43 leading proposals with commitments of $25,206.95) and www.SourceXChange.com (registered users: 10,399; active developers: 2,691; roughly $16,000 in estimated transacted business at end of May 2000 (revise)).
Graydon Hoare has argued, however, that the bounty model is ill-suited to capitalize on the best and most dynamic aspects of open source culture, viz. its openness, its willingness to share, and many eyes making bugs shallow. He says the really smart programmers won't be interested anyway; they can find their own tasty projects to work on. The bounty will have to be monopolized or shared between a small number of people.
1. It's not the community model.
2. It's not the inspired developer model.
3. It's not the annoyed scratchy genius model.
4. It's not the market model.
5. It's technically difficult. Extraneous technicalities are overburdensome for contracts which are so small and fundamentally uninteresting.
I think he may be spot on: as soon as money is involved for some, it pollutes the relationships with those on the periphery who remain non-monetized. Everyone is paid or no-one is paid.
In somewhat different form, there is Stephen King’s excursion into online publishing, which has so far turned a profit of $463,832.27:
http://www.stephenking.com/PlantNumbers_010101.html
g. Voluntary Payments & Outstanding Difficulties: Listener/Viewer Support, Paypal, Amazon, Tipster and the continuing absence of Micropayments.
Credit card companies charge recipients at a rate that still makes micropayments unfeasible (3.5% of the total plus $0.23). Into the gap has come PayPal, which allows minimum transactions as low as $0.01, and more recently amazon.com with their ‘honor system’ of payment, starting at $1.00 (cost: $0.15 plus 15%).
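Worked through, those card-fee figures show why micropayments remain unfeasible; a short sketch of the arithmetic:

```python
# The card-fee figures from the text (3.5% of the total plus $0.23) imply a
# floor below which a payment is consumed entirely by fees.
def net_to_recipient(amount, pct=0.035, flat=0.23):
    """What the recipient keeps after the percentage and flat fees."""
    return amount - (amount * pct + flat)

print(net_to_recipient(0.25))   # a 25-cent payment leaves barely a cent
print(net_to_recipient(10.00))  # a $10 payment loses only ~6% to fees

# Break-even point: amount * (1 - pct) = flat
breakeven = 0.23 / (1 - 0.035)
print(round(breakeven, 2))      # ~0.24: anything below ~24 cents nets nothing
```

The flat fee, not the percentage, is what kills small payments, which is why aggregators like PayPal (batching many small transfers into fewer card transactions) can profitably quote lower minimums.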
PayPal: PayPal had competitors and imitators, but only one, X.com, had significant market share. The two companies merged last March, taking the X.com corporate name and the PayPal product name. The combined company has more than 80 percent market share.
Paypal Donate:
http://www.potlatch.net/
www.fairtunes.com; see their statistics, including a list of the last ten donors: http://www.fairtunes.com/stats/
Total number of contributions: 1,248
Total US contributions: $7,790.14 US
Total CA contributions: $632.08 CA
Pacifica raised $7,652,543 in listener support out of total revenues of $10,733,430 during the fiscal year 1999.
In addition to listener support, the Pacifica network received $1,356,818 in federal funding from the Corporation for Public Broadcasting.
CPB Appropriations 1990 – 2000:
Year        | 1991  | 1992  | 1993  | 1994  | 1995  | 1996  | 1997  | 1998  | 1999  | 2000  |
Amount ($M) | 229.4 | 298.9 | 327.3 | 318.6 | 275.0 | 285.6 | 275.0 | 260.0 | 250.0 | 300.0 |
More Good News: Listener support as a percentage of total revenues for public broadcasting has risen dramatically in the last two decades:
Source (income in thousands) | 1982            | 1997            |
Total Income                 | 845,214         | 1,932,260       |
CPB                          | 172,000 (20.3%) | 260,000 (13.5%) |
Other federal                | 25,625 (3%)     | 62,271 (3.2%)   |
Local Gov.                   | 42,353 (5%)     | 66,087 (3.4%)   |
State Gov.                   | 166,515 (19.7%) | 298,834 (15.5%) |
Pub. Colleges                | 92,170 (10.9%)  | 177,951 (9.3%)  |
Private Colleges             | 12,870 (1.5%)   | 35,206 (1.8%)   |
Foundations                  | 22,108 (2.6%)   | 111,570 (5.8%)  |
Business                     | 100,486 (11.9%) | 277,576 (14.4%) |
Subscribers                  | 142,076 (16.8%) | 472,040 (24.4%) |
Auctions/Other               | 69,011 (8.2%)   | 170,725 (8.8%)  |
Furthermore the growth in the size of the contribution made by each donor has greatly outstripped inflation:
                                              | 1980        | 1998        |
Membership income as % of total cash income:
Public radio                                  | 15%         | 35%         |
Public television                             | 15%         | 24%         |
Number of individual contributors:
Public radio                                  | 0.5 million | 2.1 million |
Public television                             | 2.5 million | 4.7 million |
Average per-person contribution (current dollars):
Public radio                                  | $24.84      | $70.47      |
Public television                             | $30.12      | $73.30      |
Source: Corporation for Public Broadcasting
McChesney invokes Dean Baker's proposal that federal taxation law be changed to allow each citizen to dedicate $150 to the non-profit medium of their choice, deductible from the individual tax bill: public subsidy without the hazards or inefficiencies of government administration.
h. Freeriding and Gnutella: The Return of the Tragedy of the Commons: Bandwidth, crisis of P2P, tragedy of the commons, Napster's coming difficulty with a business plan and Mojo Karma. Doing things the freenet way.
Eytan Adar & Bernardo Huberman (2000)
Hypothesis 1: A significant portion of Gnutella peers are free riders.
Hypothesis 2: Free riders are distributed evenly across different domains (and by speed of their network connections).
Hypothesis 3: Peers that provide files for download are not necessarily those from which files are downloaded.
" In a general social dilemma, a group of people attempts to utilize a common good in the absence of central authority. In the case of a system like Gnutella, one common good is the provision of a very large library of files, music and other documents to the user community. Another might be the shared bandwidth in the system. The dilemma for each individual is then to either contribute to the common good, or to shirk and free ride on the work of others.
Since files on Gnutella are treated like a public good and the users are not charged in proportion to their use, it appears rational for people to download music files without contributing by making their own files accessible to other users. Because every individual can reason this way and free ride on the efforts of others, the whole system's performance can degrade considerably, which makes everyone worse off - the tragedy of the digital commons ."
Figure 1 illustrates the number of files shared by each of the 33,335 peers we counted in our measurement. The sites are rank ordered (i.e. sorted by the number of files they offer) from left to right. These results indicate that 22,084, or approximately 66%, of the peers share no files, and that 24,347 or 73% share ten or less files.
The top           | Share     | As percent of the whole |
333 hosts (1%)    | 1,142,645 | 37% |
1,667 hosts (5%)  | 2,182,087 | 70% |
3,334 hosts (10%) | 2,692,082 | 87% |
5,000 hosts (15%) | 2,928,905 | 94% |
6,667 hosts (20%) | 3,037,232 | 98% |
8,333 hosts (25%) | 3,082,572 | 99% |
Table 1
And providing files actually downloaded?
Again, we measured a considerable amount of free riding on the Gnutella network. Out of the sample set, 7,349 peers, or approximately 63%, never provided a query response. These were hosts that in theory had files to share but never responded to queries (most likely because they didn't provide "desirable" files).
Figure 2 illustrates the data by depicting the rank ordering of these sites versus the number of query responses each host provided. We again see a rapid decline in the responses as a function of the rank, indicating that very few sites do the bulk of the work. Of the 11,585 sharing hosts the top 1 percent of sites provides nearly 47% of all answers, and the top 25 percent provide 98%.
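The underlying measurement is simple to reproduce in outline: rank the hosts by contribution and sum the top percentile. The sketch below uses synthetic, Zipf-like data, not the paper's trace:

```python
# Rank hosts by the number of files they share and ask what fraction of the
# whole library the top percentile provides. The population is synthetic
# (a rough Zipf-like skew plus many zero-sharers), not the paper's data.
def top_share(files_per_host, fraction):
    """Fraction of all shared files held by the top `fraction` of hosts."""
    ranked = sorted(files_per_host, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

hosts = [1000 // (rank + 1) for rank in range(500)] + [0] * 1000  # 1,500 peers
print(top_share(hosts, 0.01))   # the top 1% of hosts holds roughly half the files
print(top_share(hosts, 0.25))   # the top quarter holds nearly everything
```

Any heavily skewed sharing distribution reproduces the paper's qualitative result: a tiny minority of hosts carries the system.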
Quality?
We found the degree to which queries are concentrated through a separate set of experiments in which we recorded a set of 202,509 Gnutella queries. The top 1 percent of those queries accounted for 37% of the total queries on the Gnutella network. The top 25 percent account for over 75% of the total queries. In reality these values are even higher due to the equivalence of queries ("britney spears" vs. "spears britney").
Tragedy?
First, peers that provide files are set to handle only a limited number of connections for file download. This limit can essentially be considered a bandwidth limitation of the hosts. Now imagine that there are only a few hosts that provide responses to most file requests (as was illustrated in the results section). As the connections to these peers are limited, they will rapidly become saturated and remain so, thus preventing the bulk of the population from retrieving content from them.
A second way in which quality of service degrades is through the impact of additional hosts on the search horizon. The search horizon is the farthest set of hosts reachable by a search request. For example, with a time-to-live of five, search messages will reach at most peers that are five hops away. Any host that is six hops away is unreachable and therefore outside the horizon. As the number of peers in Gnutella increases, more and more hosts are pushed outside the search horizon, and the files held by those hosts pass beyond reach.
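The horizon effect can be sketched with a simple no-overlap branching model (an idealized upper bound of my own, not a figure from the paper; real networks have overlapping paths, so the true horizon is smaller):

```python
# Sketch: upper bound on the Gnutella search horizon, assuming each
# peer holds `conns` open connections and no two forwarding paths
# overlap. Real networks have loops, so the true figure is lower.

def horizon(conns, ttl):
    """Maximum peers reachable by a query with the given TTL."""
    reachable = 0
    frontier = conns           # hop 1: our direct neighbours
    for hop in range(1, ttl + 1):
        reachable += frontier
        frontier *= conns - 1  # each neighbour forwards on its other links
    return reachable

print(horizon(4, 5))  # 484: 4 + 12 + 36 + 108 + 324
```

Whatever the constants, the bound is fixed once TTL and connection count are fixed, which is why adding peers pushes an ever larger share of the network outside the horizon.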
Easily isolated providers are set up for litigation by the RIAA etc.
Solutions?
i. In the "old days" of the modem-based bulletin board services (BBS), users were required to upload files to the bulletin board before they were able to download.
ii. Freenet, for example, forces caching of downloaded files on various hosts. This allows for replication of data in the network, forcing those who are on the network to provide shared files.
iii. Another possible solution to this problem is the transformation of what is effectively a public good into a private one. This can be accomplished by setting up a market-based architecture that allows peers to buy and sell computer processing resources, very much in the spirit in which Spawn was created.
i. If granularity is the answer, then empirical data is going to be required to get us there. Normative choices must be made in an informed environment; not all decisions need rigorously adhere to the indications of empiricism, but where there is a conflict this should be transparent and the social grounds for choosing an alternative made clear.
Building a Practical Model: The Movie Industry
Disney
With regard to home video, DVD is beginning to breathe new life into this market. The growth in sales of DVD players has exceeded the growth of CD players at this stage. When I wrote you a year ago, there were 9.7 million units in American households. Now, there are an estimated 22 million units. And, at the end of 2001, it is forecast that there will be more than 36 million units. This trend is especially important to our company because more and more people are not just buying DVDs of new movies, they are also buying DVDs of movies they already own on VHS. For example, someone who already has Pinocchio in his library might now buy Pinocchio on DVD because of the added quality and extra features. Indeed, a survey conducted earlier this year indicated that 14 percent of people who own DVD players said they are likely to replace all of their VHS movies … while 55 percent said they are likely to replace their Disney videos. When it comes to home entertainment, there is a Disney difference, and consumers know it.
Changing Revenue Sources in the Motion Picture Industry 1980 - 1995
Window | 1980 | 1995
Theater D | 29.6% | 14.4%
Theater F | 22.8% | 12.8%
Home Video | 7.0% | 40.6%
Pay Cable | 6.0% | 7.8%
Network TV | 10.8% | 1.4%
Syndication | 3.8% | 4.2%
Foreign TV | 2.5% | 6.7%
Made for TV | 17.5% | 12.2%
Source: Entertainment Industry Economics: A Guide for Financial Analysis, 1998.
http://www.venge.net/graydon/bounty.html
http://slashdot.org/articles/99/06/10/1529204.shtml on Bounty
http://www.artcom.de/AxelZerdick/limits/25-FF-Movie.pdf
http://www.uwindsor.ca:7000/comm.stud/gold/film/outlinee.htm
Other Problems
Comment: The FSF and Open Source provide the factual base on which Moglen's argument is built. The development of Graphical User Interfaces such as GNOME, and of Photoshop substitutes, suggests that open source is moving towards a point where its adoption by the mass market becomes viable. The advantages for system administrators etc. are already manifest in the success of Apache/Perl etc.
The fact that multifarious forms of culture can now be rendered as bits, and are thus subject to the law of the bit, leaves a lot of other questions about cultural production unanswered. What type of information is produced in an environment where legal regulation is ineffective? Who will control it?
The production of literature or song is not as amenable to the form of collaborative construction as is the case with modularized software. There are examples, but they remain the exception, and in most cases are initiated through practical co-operation in real-space rather than contact through news groups.
1. Entertainment companies can port many of the factors that give them an advantage to the net. If we abolish copyright altogether, then producers of independent works will have no recourse if their work is appropriated by the commercial behemoths. There are two solutions to this problem. One is the retention of the right to public performance, on a copyright or compulsory license model that would oblige remuneration of the producer.
The other is to stick with the GPL model and reserve all commercial implementations within the fabric of copyright law.
2. The point of hastening the demise of the media industry is to improve the quality of the information environment. As above, a necessary precondition to this is the enabling of independent producers, who have mouths to feed, weans to clothe, and beer to drink. They must be remunerated, not as a matter of incentive but as a matter of survival. The alternative is a return to patronage along one of two lines. The first is patronage through individuals and corporations; this is unwelcome because it regresses the social context of the production to a date prior to the French revolution, of which Eben writes rather fondly. The second is a system of public patronage through either (a) the state or (b) the public tout court, in unmediated fashion. The former is undesirable for all the reasons we associate with the first amendment, the boredom and wastefulness of time involved in filling in grant applications, and the example of the NEA. The latter is desirable and possible - Bruce Schneier has outlined a protocol roughly fitted to the task - but problems remain.
The SPP can be characterized as either a system of public patronage or an embargo system working on an escrow-based market, according to the sensibility of the speaker.
The principal problem is that of free-riding, or as it is sometimes referred to, 'strategic behavior' on the part of individuals. In essence the claim is that if the production of works is contingent upon receipt of a given level of donations or pledges from the public, individuals will decide to wait in the expectation that others will pay for the works which they want to get their mitts on. The result is that there is underproduction of works. This argument could also be called the apologia for force argument, in that it supports copyright law's imposition of an authoritarian obligation to pay.
One solution would be to have a list of donors displayed on the site containing the original copy of the embargoed work, thus providing a self-congratulatory factor. Hey, we might even get a useful transposition of the first post irritant that has long plagued Slashdot! First donor!
A similar problem is revealed in the PARC Xerox survey of Gnutella users, which found that even where the cost of participation was low, 70% of the content available derived from just 5% of the active servers. Most users did not contribute anything at all. Could insert an upload/download ratio to remind participants; don't even try to make it foolproof, just make it a little bit inconvenient.
Another issue, although much less compelling, asks why users would pay ex ante for the production of something whose content and quality they are not familiar with. This claim supports publishers' role as the financiers and public marketers of works. These objections are not compelling because today, when we buy a CD, we are usually acquainted with only a portion of its contents. The rest we purchase at our own risk. Thus there is actually little difference between the two set-ups. Publishing intermediaries have traditionally played a role as filters evaluating content, and peer groups and artists' portals are more than capable of fulfilling the same role, and doing it better. Then there are endorsements by known individuals, as with books these days. There would be more room for reviewers, and more room for small publishing operations where the imbalance in bargaining power would not be so drastic.
2. Strategic behavior problems can be expected to take their toll in an initial period. Insufficient demand, as expressed in a lack of greenbacks, will drive down supply. If the supply drops below this demand, the people who freeload will have to pay up or go without. That is a question of time. It also allows the reformulation of the question: what is free content? Is advertising-subsidized content free? Does our attention have a price? Of course it does; that is why companies spend up to 25% of their revenues trying to acquire and hold it.
3. One objection that requires more consideration is the claim that barriers to entry for new artists will be higher if popularity is related purely to reputation. This is another surreptitious claim to the benefits of the publishers' role, namely to invest and take risks in new work. But the reason why such risk-taking is necessary is the extent of the barriers comprised of marketing and distribution costs. Marketing has many unpleasant spillover effects, like a media environment where our taste (as determined by the array of options proposed) is fashioned externally by economic agents.
4. Chicago-school objectors claim that the system is doomed by the impossibility of pricing ex ante. Thus the price will be set either too high, leading to a failure to garner enough donations to produce the work, or too low, implying that the artist does not receive her due given the popularity of the work. In the first scenario, pricing corrections will be possible. In the latter, a pricing correction can be applied to the next release. Tipping also offers a realistic, although admittedly partial, remedy here.
Another slashdotter suggested:
"How about this as a way of calibrating "runaway hits": artist announces a new song is available, it will be released in one year -- and each $100 contributed will move up the release date by one day. (Set the dollars-per-day amount higher or lower depending on the artist's popularity.)
The song eventually gets released, even if no one donates a dime; but if enough people are interested and contribute, the song could be released almost immediately, and the artist will instantly be $36,500 richer.
Alternately, the song is to be released in six months, but a contribution total of $1 knocks one week off the release, $2 for two weeks, $4 for three weeks ... each additional doubling knocks off another week. This makes it theoretically possible for the artist to earn up to $32M (not bloody likely for just one song), and eliminates the need to calibrate the pricing to the artist."
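The poster's two calibration schemes can be sketched directly (the $100-per-day and doubling parameters are the poster's; the function names and release-window caps are my reading of the comment):

```python
import math

# Sketch of the two release-date calibration schemes proposed above.

def linear_days_early(donations, dollars_per_day=100, max_days=365):
    """Each $100 moves the release up one day, capped at a year."""
    return min(donations // dollars_per_day, max_days)

def doubling_weeks_early(donations, max_weeks=26):
    """$1 knocks off one week, and each doubling knocks off another:
    $1 -> 1 week, $2 -> 2, $4 -> 3, ... (six-month release window)."""
    if donations < 1:
        return 0
    return min(int(math.log2(donations)) + 1, max_weeks)

print(linear_days_early(36500))     # 365: release moves up a full year
print(doubling_weeks_early(4))      # 3 weeks
print(doubling_weeks_early(2**25))  # 26: the cap, reached near the ~$32M ceiling
```

The doubling scheme's appeal is visible in the arithmetic: early dollars move the date a lot, while emptying the full window requires exponentially more money, which is what removes the need to calibrate pricing per artist.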
5. What happens to the money if the threshold isn't reached?
6. Get stats on investigative reporting from Ed Baker
7. Lots of suggestions on the tipping variety.
8. A reinvigorated concept of the physical object.
5. Stats:
In five days, 120,000 people downloaded the first chapter, and 76% made a voluntary contribution of $1.00.
The 'Street Pusher Protocol': pay me and I'll give you more of the same.
How much of the current cost of a CD goes to the artist?
How much of the current book price goes to the artist?
How many full-time writers, musicians, actors etc. are there earning more than 50,000 p.a.?
Advance orders for new books, what do they total?
Magazine subscriptions what do they total?
What device could be designed to ensure that pledged money is used in the period prior to release to lubricate the process - money management etc.
5.10 U.S. Studio Revenues from Feature Films, 1994-1996: integration of the different distributional systems changed the revenue rules for a feature film. In 1996 video rental revenues were nearly double the box office revenues.
Source: Goldman Sachs US Media Research, 1998
(So craft a right of appropriation against commercial competitors for three years, the time it takes to amortize a first edition.)
Whilst the goods themselves may be enjoyed in a nonrival, inclusive manner, this is not to say that such a system could continue to produce works at the same rate or quality. If incessant expropriation extinguishes all incentive to create new works, then we would certainly be left in tragedy.
Secondly, as illustrated in the case of Gnutella below, there is another potential tragedy of the commons in the digital sphere: a collapse of the distribution infrastructure, brought on by too many users who do not contribute anything to the system and thus only send requests, cannot answer them, and are thus net users of bandwidth whilst their own lies fallow. If you have a rare song, you will be pummeled.
Lunasa - Dr Gilbert - Devils of Dublin
Blows your mind. On the verge of entering sublime Davy Spillane level 'One Day in June'. The uileann piper used to play monday nights in Mona's on Avenue B. I don't know if he still does. Lunasa are fuckin deadly, although I've only seen them once, ironically, on the WTC plaza on September 6 2001.....
File size | Modem, 56 Kbps | Cable, 512 Kbps | T1, 2 Mbps
Picture, 200 Kb | 40 seconds | 2 seconds | 2 seconds
Music track, 4 Mb | 13 min 30 seconds | 1 minute | 15 seconds
Full-length movie, 400 Mb | 22 hours | 1 hour 45 minutes | 25 minutes
Five-minute video clip, 20 Mb | 1 hour | 6 minutes | 2 minutes
200-page novel, 1 Mb | 4 minutes | 15 seconds | 4 seconds
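The table's figures follow roughly from size ÷ line rate; a sketch of the raw arithmetic (treating the sizes as bytes; real transfers add protocol overhead, which is why the quoted times run higher than these minima):

```python
# Sketch: theoretical minimum transfer time = file size / line rate.
# Protocol overhead and congestion push real times higher.

def transfer_seconds(size_megabytes, rate_kbps):
    bits = size_megabytes * 8 * 1_000_000  # decimal megabytes -> bits
    return bits / (rate_kbps * 1000)

for name, mb in [("picture", 0.2), ("music track", 4), ("movie", 400)]:
    for line, kbps in [("modem", 56), ("cable", 512), ("T1", 2000)]:
        print(f"{name} over {line}: {transfer_seconds(mb, kbps):,.0f} s")
```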
Distributed Distribution
Peer networks can be used to deliver the services known as Content Distribution Networks (CDNs), essentially comprising the storage, retrieval and dissemination of information. Companies such as Akamai and Digital Harbour have already achieved significant success by installing their own proprietary models of this function on a global network level, yet the same functions can be delivered by networks of users even where they have only a dial-up connection. Napster constituted the first instantiation of this potential, and subsequent generations of file-sharing technology have delivered important advances in increasing the robustness and efficiency of such networks. In order to understand the role that peers can play in this context we must first examine the factors which determine data flow rates in the network in general.
The slow roll-out of broadband connections to home users has concentrated much attention on the problem of the so-called 'last mile' of connectivity. Yet the connection between the user and their ISP is but one of four crucial variables deciding the rate at which we access the data sought. Problems of capacity exist at multiple other points in the network, and as the penetration of high-speed lines into the 'consumer' population increases, these other bottlenecks will become more apparent.
If the desired information is stored at a central server, the first shackle on speed is the nature of the connection between that server and the internet backbone. Inadequate bandwidth, or access attempts by an unexpected number of clients making simultaneous requests, will handicap transfer rates. This factor is known as the 'first mile' problem and is highlighted by instances such as the difficulty in accessing documentation released during the Clinton impeachment hearings and, more frequently, by the 'Slashdot effect'.
In order to reach its destination the data must flow across several networks, which are connected on the basis of what are known as 'peering' arrangements between the networks, facilitated by routers which serve as the interface. Link capacity tends to be underprovided relative to traffic, leading to router queuing delays. As the number of ISPs continues to grow this problem is anticipated to persist, since whether links are established is essentially an economic question.
The third point of congestion is located at the level of the internet backbone, through which almost all traffic currently passes at some point. The backbone's capacity is a function of its cables and, more problematically, its routers. There is a mismatch between the growth of traffic and the pace of technological advance in router hardware and packet-forwarding software. As more data-intensive transfers proliferate, this discrepancy between demand and capacity is further exacerbated, leading to delays.
Only after negotiating these three congestion points do we arrive at the delay imposed at the last mile.
What are the benchmarks to evaluate Quality of Service? ("Typically, QoS is characterized by packet loss, packet delay, time to first packet (time elapsed between a subscribe request send and the start of stream), and jitter. Jitter is effectively eliminated by a huge client side buffer [SJ95]." Deshpande, Hrishikesh; Bawa, Mayank; Garcia-Molina, Hector, Streaming Live Media over a Peer-to-Peer Network)
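A sketch of how those four benchmarks could be computed from per-packet timestamps (the mean-absolute-deviation definition of jitter used here is one common choice, not necessarily the cited paper's):

```python
# Sketch: compute the QoS benchmarks named above from packet logs.
# Times are in milliseconds; the sample values are illustrative.

def qos_metrics(sent_ms, received_ms, subscribe_ms):
    """received_ms[i] is None for a lost packet."""
    pairs = [(s, r) for s, r in zip(sent_ms, received_ms) if r is not None]
    loss = 1 - len(pairs) / len(sent_ms)                 # packet loss
    delays = [r - s for s, r in pairs]
    mean_delay = sum(delays) / len(delays)               # packet delay
    arrivals = [r for _, r in pairs]
    gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
    mean_gap = sum(gaps) / len(gaps)
    jitter = sum(abs(g - mean_gap) for g in gaps) / len(gaps)
    time_to_first = arrivals[0] - subscribe_ms           # time to first packet
    return loss, mean_delay, jitter, time_to_first

sent = [0, 20, 40, 60, 80]
recv = [55, 76, None, 118, 137]  # third packet lost
print(qos_metrics(sent, recv, subscribe_ms=-10))
```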
Current Technologies
Current Implementations
1. Storage Service Providers
descriptions of akamai freeflow hardware software mix:
algorithms plus machines
mapping server (fast to check hops to region) and
content server
http://www.wired.com/wired/archive/7.08/akamai_pr.html
sandpiper
applications
Akamai
13,000 edge servers in network provider data center locations
click thru - 20%
- 10 - 15% abandonment rates
15% + order completion
- overladen web servers
- reduce delays
first static
now dynamic and customized (edge server)
fig.1 trad server
distributed server
illustrate delivery speed determinants
database/legacy ----- middleware ----- client browser
middle - performance/security/simplification of client
program
operation
IRAC
issue: cache management
TTL value
Issue: personalisation/cookie/cms driven content
Load Balancing
"Load balancing is a technique used to scale an Internet or other service by spreading the load of multiple requests over a large number of servers. Often load balancing is done transparently, using a so-called layer 4 router?." [wikipedia]
LB Appliances
LB Software
LB Intelligent Switches
Traffic Distributors
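A minimal sketch of the request-spreading idea in the quotation, using plain round-robin; real appliances and layer-4 switches also weigh server load and health (server names are hypothetical):

```python
import itertools

# Sketch: round-robin spreading of requests over a server pool,
# the core of the load-balancing technique quoted above.

class RoundRobinBalancer:
    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)

    def route(self, request):
        """Pick the next server in rotation for this request."""
        return next(self._cycle)

pool = RoundRobinBalancer(["srv-a", "srv-b", "srv-c"])  # hypothetical hosts
for req in range(5):
    print(f"request {req} -> {pool.route(req)}")
```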
Supernodes
Gnucleus, BearShare and LimeWire are all compatible.
Cisco (DistributedDirector), GTE Internetworking (which acquired BBN and with it Genuity's Hopscotch), and Resonate (Central Dispatch) have been selling such solutions as installable software or hardware. Digex and GTE Internetworking (Web Advantage) offer hosting that uses intelligent load balancing and routing within a single ISP. These work like Akamai's and Sandpiper's services, but with a narrower focus. - Wired
Data providers concerned to provide optimal delivery to end users are increasingly opting to use specialist services such as Akamai to overcome these problems. Akamai delivers content faster through a combination of proprietary load-balancing and distribution algorithms and a network of machines installed across hundreds of networks, on which popularly requested data is cached (11,689 servers across 821 networks in 62 countries). This spread of servers obviates much congestion, as the data is provided from the server cache either on the requesting network itself (bypassing the peering and backbone router problems and mitigating that of the first mile) or on the most efficient available network given load-balancing requirements.
File Sharing Technologies
Popular file-sharing utilities arose to satisfy a more worldly demand than the need to ameliorate infrastructural shortfalls. When Shawn Fanning released his Napster client, the intention was to allow end-users to share MP3 files by providing a centralised index of all songs available on the network at a given moment and the ability for users to connect to one another directly to receive the desired file. Essentially, popular file-sharing utilities enable content pooling. Napster's legal woes generated the necessary publicity to encourage user adoption, and for new competitors to enter the market and to innovate further. In the following section I describe some of the later generations of file-sharing software and chart the innovations which have brought them into a space of competition with Akamai et al.
The original implementation has been credited to Justin Frankel and Tom Pepper of a programming division of AOL (the then-recently purchased Nullsoft Inc.) in 2000. On March 14th, the program was made available for download on Nullsoft's servers. The source code was to be released later, supposedly under the GPL license. The event was announced on Slashdot, and thousands downloaded the program that day. The next day, AOL stopped the availability of the program over legal concerns and restrained the Nullsoft division from doing any further work on the project. This did not stop Gnutella; after a few days the protocol had been reverse engineered and compatible open source clones started showing up. (from Wikipedia)
[ENTER DESCRIPTION]
The greatest blind spot in McChesney's analysis, however, concerns his silence on the issue of intellectual property. Thus, he devotes a section of his internet chapter to examining the role played by traditional media manufacturers in determining the contours of the new landscape, their advertising forecasts, their partnerships for the distribution of music, their ownership of high-profile brands etc., without so much as mentioning the important evolution taking place in file-sharing technology that is revolutionizing media distribution. What began as a basically centralized model vulnerable to legal attack (Napster) has evolved through at least two further generations. The Gnutella network (BearShare/LimeWire) represents the first, a decentralized client-server application. This allows a much more robust network in the sense that connectivity is not dependent on the legal health of a single operator. The trade-off is inefficiency in the locating of files and the problem of free-riding users, who actually impede the functionality of the system beyond simply failing to contribute material. LimeWire addresses this problem to some degree by providing the option to refuse uploads to users who do not share a threshold number of files. Unfortunately this cannot attenuate the problem of inefficient searches per se, merely offering a disciplinary instrument to force users to contribute. In order to sharpen search capacities in the context of a problematic network design, these networks have taken recourse to nominating certain nodes as super-peers, by virtue of the large number of files they are serving themselves. While essentially efficacious, the consequence is to undermine the legal robustness of the network. The threat is made clear in a paper published last year by researchers at PARC Xerox that analyzed traffic patterns over the Gnutella network and found that a small fraction of nodes was supplying the great majority of the files.
These users are vulnerable to criminal prosecution under the No Electronic Theft Act and the Digital Millennium Copyright Act. The music industry has been reluctant to invoke this form of action thus far, principally because of its confidence that the scaling problem of the Gnutella community limits the potential commercial harm it can inflict. As super-peering etc. becomes more effective this may change.
Another interesting attribute of the LimeWire system is the option it provides to set up virtual private networks, so that users can establish perimetered communities based upon their own social affinities. Now this is the nightmare of the IP police.
Third generation file-sharing systems begin with the Freenet architecture outlined by Ian Clarke in 1999. Although the Freenet network has not achieved anything like the same adoption scale as other systems, its design characteristics set the standard which has been emulated by others, specifically those built on top of the 'FastTrack' system. The crux of Freenet's genius is its adoption of 'small world' organization. This refers to the experiment carried out by Stanley Milgram in the 1960s, in which 160 people throughout the United States were given letters to be delivered to stockbrokers and asked to pass them only through people that they knew in order to get them to their final destination. 42 of the letters arrived, using an average of 5.5 intermediaries. The purpose was to illustrate the level of social interconnectivity, and it is an experience with which most of us are familiar, as when one meets a stranger from a distant clime and discovers that you know someone in common. It is not that everyone has such an expansive social sphere, but rather that there are individuals whose circle of acquaintance cuts across a wide range of social groups. Freenet utilizes this principle by giving its software a feature which allows it to retain knowledge of the content available on other nodes; information is retained between sessions. The result is an extremely effective search, storage and retrieval system. This feature has consequently been emulated by systems such as AudioGalaxy and Kazaa.
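The Milgram-style forwarding that Freenet exploits can be illustrated with a toy greedy-routing simulation on a ring with random shortcuts (a sketch of the small-world principle only, not Freenet's actual routing protocol):

```python
import random

# Toy small-world illustration: nodes on a ring know their immediate
# neighbours plus one random long-range contact; each hop greedily
# forwards towards the contact closest to the target, as participants
# did in the Milgram letter experiment. A sketch of the principle only.

def build_network(n, seed=1):
    rng = random.Random(seed)
    return {i: {(i - 1) % n, (i + 1) % n, rng.randrange(n)} for i in range(n)}

def greedy_route(links, start, target, n):
    hops, node = 0, start
    while node != target:
        # forward to the known contact closest to the target on the ring
        node = min(links[node],
                   key=lambda c: min((c - target) % n, (target - c) % n))
        hops += 1
    return hops

net = build_network(1000)
print(greedy_route(net, start=0, target=500, n=1000))
```

The random long-range contacts are what keep hop counts far below the ring distance: most nodes have a parochial view, but shortcuts occasionally leap across the keyspace, mirroring the acquaintances whose circles cut across social groups.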
A crucial point in all of this is that both Gnutella and Freenet are open source/free software, thus allowing non-commercially motivated individuals and groups to take up the baton as the main players progressively move towards a rapprochement with industry. Napster has died attempting to placate its erstwhile enemies, whilst Kazaa will not allow downloads above 128 kilobytes per second in an attempt to appease the same industry, with whose representatives it is currently in negotiation for a license to move to a full commercial platform. These are both proprietary technologies, so they can exclude any rivalrous non-compliant competitors. AudioGalaxy, however, is under the General Public License. AG deals with the 'tragedy of the commons' in a more determined manner(!). Specifically, it only allows the user to transfer more than one file at a time if they are sharing a minimum of 25 files. Likewise, there is no option not to share - the only means of not sharing is to exit AG, which means of course that the user cannot download files either.
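As described, the AudioGalaxy rule amounts to a simple gate on concurrent transfers (a sketch; the 25-file threshold is from the text above, the function itself is mine):

```python
# Sketch of the AudioGalaxy rule described above: a peer may run
# multiple simultaneous transfers only if it shares at least 25 files.

SHARE_THRESHOLD = 25

def may_start_transfer(files_shared, active_transfers):
    """A second concurrent transfer requires sharing 25+ files."""
    if files_shared >= SHARE_THRESHOLD:
        return True
    return active_transfers == 0

print(may_start_transfer(3, 0))   # True: a first transfer is always allowed
print(may_start_transfer(3, 1))   # False: must share 25+ files for a second
print(may_start_transfer(30, 4))  # True
```

Note how mild the sanction is: free riders are throttled rather than excluded, which matches the earlier suggestion of making free riding "a little bit inconvenient" rather than foolproof.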
Similar systems, such as Cloudcast (FastTrack) and Swarmcast, are now being offered by these companies to commercial media distributors, using technical devices to allow distributed downloads that automate transfer from other nodes when one user logs off. The intention here is clearly the development of software-based alternatives to the hardware offered by Akamai, the principal player in delivering accelerated downloads, used by CNN, Apple and ABC amongst others.
The point of all this is that there is a distribution system available now that can allow the global distribution of critical media. This network is not globally inclusive and is predicated upon access to a telephone line, a computer and (preferably) a high-speed network connection, but other more powerful economic forces are driving the permeation of all these technologies, so this is a problem which will be progressively mitigated. In any case, exclusion is a fact of all media, whether one considers literacy (print) or purchase capacity (television/satellite). Radio is probably the most fundamentally democratic medium in an ideal sense, since the cost of acquiring a receiver is relatively low, and the spread of linguistic range in the content available is basically quite comprehensive.
technical descriptions
napster
gnutella
fast track innovations
freenet
(search algorithms: Theodore Hong)
Milgram anecdote
open source v proprietary
commercial implementations
swarmcast, cloudcast, upriser
The top four file-sharing systems -- FastTrack, Audiogalaxy, iMesh, and Gnutella -- were used to download 3.05 billion files during August, according to Webnoize.
eDonkey: client-server based sharing/chat network with sophisticated multi-source downloading (download from someone else even when he's still downloading the same file).
FastTrack -- the technology used by Consumer Empowerment, one of the companies sued on Wednesday -- has seen traffic grow 60 percent a month over the course of the year. With 970 million files shared, it's the most used file-trading application on the Internet.
The other three services -- Audiogalaxy, iMesh and Gnutella -- had 2.08 billion files swapped using the decentralized networks. While none of the systems tops Napster's peak performance of 2.79 billion files shared, industry experts believe it is only a matter of time before these services surpass Napster.
edonkey
sharereactor, filedonkey, filenexus
Economic Factors Influencing Peer Distribution
The motivation attracting participation in these networks remains that which inspired Napster's inventor: the opportunity to acquire practically unlimited content. Early in the growth of Napster's popularity, users realised that other types of files could be exchanged apart from music, as all that was required was a straightforward alteration of the naming protocol such that the file appeared to be an MP3 (Unwrapper). Later applications were explicitly intended to facilitate the sharing of other media, such that today huge numbers of films, television programs, books, animations, pornography of every description, games and software are available. The promise of such goodies is obviously an adequate incentive for users to search out, select and install a client/server application and to acquire the knowledge necessary to operate it. Intuitive Graphical User Interfaces enable a fairly rapid learning curve, in addition to which a myriad of user discussion forums, weblogs and news groups provide all that the curious or perplexed could demand.
Internet access pricing plans obviously the key
determinant.
Motivation
- performance
- access to goods in kind
Whilst it is obvious why users utilise these tools to extract material, it is not so plain why they should also use them to provide material in turn to others, and so avoid a tragedy of the commons. Key to the willingness to provide bandwidth has been the availability of cable and DSL lines, which provide capacity in excess of most individuals' needs at a flat-rate cost. There is thus no correlation between the amount of bandwidth used and the price paid; in brief, there is no obvious financial cost to the provider. In areas where there are total transfer caps, or where use is on a strictly metered basis, participation is lower for the same reason. For those on flat-pricing packages there are some costs imposed, such as a slow-down in web access rates. A combination of these factors has given rise to free-riding problems, as evidenced by the study carried out by researchers at Xerox PARC on the composition of the Gnutella network [ENTER MORE DATA]. There is, however, a fairly high degree of consciousness of this problem (such users are referred to as 'leeches' and are the subject of endless vitriol on file-sharing boards) and many applications have implemented features to address the issue, a matter to which we will return below under the rubric of collective action mechanisms.
Dangers
appropriation
Fill in story about Morpheus switch from FastTrack to Gnutella
free riding
h. Freeriding and Gnutella: The Return of the Tragedy of the Commons: Bandwidth, crisis of P2P, tragedy of the commons, Napster's coming difficulty with a business plan and Mojo Karma. Doing things the freenet way.
Eytan Adar & Bernardo Huberman (2000)
Hypothesis 1: A significant portion of Gnutella peers are free riders.
Hypothesis 2: Free riders are distributed evenly across different domains (and by speed of their network connections).
Hypothesis 3: Peers that provide files for download are not necessarily those from which files are downloaded.
" In a general social dilemma, a group of people attempts to utilize a common good in the absence of central authority. In the case of a system like Gnutella, one common good is the provision of a very large library of files, music and other documents to the user community. Another might be the shared bandwidth in the system. The dilemma for each individual is then to either contribute to the common good, or to shirk and free ride on the work of others.
Since files on Gnutella are treated like a public good and the users are not charged in proportion to their use, it appears rational for people to download music files without contributing by making their own files accessible to other users. Because every individual can reason this way and free ride on the efforts of others, the whole system's performance can degrade considerably, which makes everyone worse off - the tragedy of the digital commons."
Figure 1 illustrates the number of files shared by each of the 33,335 peers we counted in our measurement. The sites are rank ordered (i.e. sorted by the number of files they offer) from left to right. These results indicate that 22,084, or approximately 66%, of the peers share no files, and that 24,347 or 73% share ten or less files.
The top                Share        As percent of the whole
333 hosts (1%)         1,142,645    37%
1,667 hosts (5%)       2,182,087    70%
3,334 hosts (10%)      2,692,082    87%
5,000 hosts (15%)      2,928,905    94%
6,667 hosts (20%)      3,037,232    98%
8,333 hosts (25%)      3,082,572    99%
Table 1
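The concentration figures in Table 1 fall out of a simple rank-ordering computation; a minimal sketch of that calculation, run here on synthetic per-host counts rather than the PARC measurements:

```python
def share_of_top(counts, fraction):
    """Fraction of all shared files held by the top `fraction` of hosts,
    with hosts rank-ordered by the number of files they offer."""
    ranked = sorted(counts, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

# Synthetic, heavily skewed population: most peers share nothing at all.
counts = [1000] * 10 + [10] * 90 + [0] * 900
print(round(share_of_top(counts, 0.01), 2))  # → 0.92
```

Even this toy distribution reproduces the qualitative result: a tiny head of the ranking accounts for nearly all the files on offer.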
And providing files actually downloaded?
Again, we measured a considerable amount of free riding on the Gnutella network. Out of the sample set, 7,349 peers, or approximately 63%, never provided a query response. These were hosts that in theory had files to share but never responded to queries (most likely because they didn't provide "desirable" files).
Figure 2 illustrates the data by depicting the rank ordering of these sites versus the number of query responses each host provided. We again see a rapid decline in the responses as a function of the rank, indicating that very few sites do the bulk of the work. Of the 11,585 sharing hosts the top 1 percent of sites provides nearly 47% of all answers, and the top 25 percent provide 98%.
Quality?
We found the degree to which queries are concentrated through a separate set of experiments in which we recorded a set of 202,509 Gnutella queries. The top 1 percent of those queries accounted for 37% of the total queries on the Gnutella network. The top 25 percent account for over 75% of the total queries. In reality these values are even higher due to the equivalence of queries ("britney spears" vs. "spears britney").
Tragedy?
First, peers that provide files are set to only handle some limited number of connections for file download. This limit can essentially be considered a bandwidth limitation of the hosts. Now imagine that there are only a few hosts that provide responses to most file requests (as was illustrated in the results section). As the connections to these peers is limited they will rapidly become saturated and remain so, thus preventing the bulk of the population from retrieving content from them.
A second way in which quality of service degrades is through the impact of additional hosts on the search horizon. The search horizon is the farthest set of hosts reachable by a search request. For example, with a time-to-live of five, search messages will reach at most peers that are five hops away. Any host that is six hops away is unreachable and therefore outside the horizon. As the number of peers in Gnutella increases more and more hosts are pushed outside the search horizon and files held by those hosts become beyond reach.
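The horizon effect can be made concrete as a breadth-first walk that stops after TTL hops; a toy sketch (the chain topology is invented for illustration, not Gnutella's real overlay):

```python
from collections import deque

def horizon(graph, start, ttl):
    """Return the set of peers reachable from `start` in at most `ttl` hops."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if dist == ttl:
            continue  # the message dies here; do not forward further
        for neighbour in graph.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append((neighbour, dist + 1))
    return seen - {start}

# A simple chain A-B-C-D-E-F-G: with a TTL of five, G (six hops away)
# lies outside the horizon and its files are beyond reach.
chain = {c1: [c2] for c1, c2 in zip("ABCDEF", "BCDEFG")}
print(sorted(horizon(chain, "A", 5)))  # → ['B', 'C', 'D', 'E', 'F']
```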
Easily isolated providers are set up for litigation by the RIAA etc.
Solutions?
i. In the "old days" of the modem-based bulletin board services (BBS), users were required to upload files to the bulletin board before they were able to download.
ii. FreeNet, for example, forces caching of downloaded files in various hosts. This allows for replication of data in the network forcing those who are on the network to provide shared files.
iii. Another possible solution to this problem is the transformation of what is effectively a public good into a private one. This can be accomplished by setting up a market-based architecture that allows peers to buy and sell computer processing resources, very much in the spirit in which Spawn was created
trust
- collective action mechanisms
- hashing
Security and privacy threats constitute other elements deterring participation, both for reasons relating to users' normative beliefs opposed to surveillance and for fear of system penetration by untrustworthy daemons. The security question has recently been scrutinised in light of the revelation that the popular application Kazaa had been packaging a utility for distributed processing known as Brilliant Digital in its installer package. Although unused thus far, it emerged that there was the potential for it to be activated in the future without the knowledge of the end-user.
Viruses
.vbs and .exe files can be excluded from searches. MP3s etc. are data, not executables.
Virus spreads via Kazaa (but the article wrongly identifies it as a worm): http://www.bitdefender.com/press/ref2706.php
Audio Galaxy: contains really ugly webHancer spyware that may make your Internet connection unusable.
Other Costs
CPU Resources
A Kazaa supernode will use a max of 10% of total CPU resources. It allows an opt-out.
Commercial Implementations
According to a study executed in early 2001 by Viant Consulting, there are now more than 500,000 television and film files being exchanged every day over file-sharing networks and through connections made in IRC [TCC p.16 for stats and methodology]. That this is bad news for the copyright owners will not be explored here; the point is rather that this form of P2P provision of the archetypal data-heavy content is already taking place between users. In the same report the authors assert that content companies have themselves been experimenting with the distributional potential of networks such as Gnutella (Viant, The Copyright Crusade, see fn 47).
Interesting comparison of acquisition times in TCC at p. 28.
http://www.badblue.com/w020408.htm
http://www.gnumarkets.com/
commercial implementations
swarmcast, cloudcast, upriser
mojo nation's market in distributed CDN.
Design considerations impeding performance for the sake of other normative objectives
freenet - censorship resistance impediment
tangler
kazaa/morpheus bitrate encoding limit for copyright reasons, easily hacked
Open Source or locked up?
closed
Bearshare
Kazaa
Grokster
Edonkey
Open
Limewire
GPL
Gnucleus
Collective Action Mechanisms
Limewire
Slots & Bandwidth Throttling
Gnotella (Windows)
Easy-to-use and very popular client written in VB, with many powerful features (search & response filtering, decent bandwidth regulation, multiple searches, private networks, skins...).
Limewire: Upload slots represent the number of files other users can download from you at any one time. The default number of slots varies based upon the connection speed you set at installation, and the default bandwidth usage is set at 50 percent of your connection speed. You can self-configure your number of upload slots and percentage of bandwidth usage by clicking on Tools > Options > Uploads.
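The slot-and-throttle scheme amounts to a simple admission check: a fixed number of upload slots, with the configured share of the link divided among whoever holds one. A toy model (the class and numbers are illustrative, not LimeWire's actual code):

```python
class UploadManager:
    """Toy model of upload slots plus a global upload-bandwidth budget."""

    def __init__(self, slots, link_kbps, fraction=0.5):
        self.slots = slots                       # concurrent uploads allowed
        self.budget_kbps = link_kbps * fraction  # share of the link given to uploads
        self.active = []

    def request_slot(self, peer):
        """Admit a peer if a slot is free; return its per-upload rate, else None."""
        if len(self.active) >= self.slots:
            return None                          # all slots busy: peer is refused/queued
        self.active.append(peer)
        return self.budget_kbps / len(self.active)

mgr = UploadManager(slots=2, link_kbps=512)
print(mgr.request_slot("peer-a"))  # → 256.0
print(mgr.request_slot("peer-b"))  # → 128.0
print(mgr.request_slot("peer-c"))  # → None
```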
Gnucleus
Another new feature is Scheduling, which lets you tell Gnucleus to run on the Gnutella network only at certain times during the day. This is useful for people who want to run Gnucleus when the load on their local network is low: at a college, someone might configure Gnucleus to run at night so that academic use of the network would not be bogged down during the day, or at a company so that daytime business traffic would not be affected.
storage
Swarmed Downloads
LimeWire 2. First of all, we've allowed for 'swarmed' downloads. If the file you are looking for can be located at multiple hosts, LimeWire will attempt simultaneous downloads from those sources, spidering different portions of the file. Consequently, you'll get your files MUCH faster than what you are used to.
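The 'spidering' of different portions can be pictured as splitting the file into contiguous byte ranges and assigning each range to a different host; a schematic sketch in which the hosts and the range fetch are simulated, not a real network client:

```python
from concurrent.futures import ThreadPoolExecutor

def split_ranges(size, parts):
    """Divide [0, size) into `parts` contiguous byte ranges."""
    step = -(-size // parts)  # ceiling division
    return [(lo, min(lo + step, size)) for lo in range(0, size, step)]

def fetch_range(host, blob, lo, hi):
    # Stand-in for an HTTP Range request to `host`.
    return blob[lo:hi]

def swarm_download(hosts, blob):
    """Fetch one range per host in parallel and reassemble in order."""
    ranges = split_ranges(len(blob), len(hosts))
    with ThreadPoolExecutor(max_workers=len(hosts)) as pool:
        chunks = pool.map(lambda hr: fetch_range(hr[0], blob, *hr[1]),
                          zip(hosts, ranges))
        return b"".join(chunks)

data = bytes(range(256)) * 40  # a 10,240-byte "file"
assert swarm_download(["host1", "host2", "host3"], data) == data
```

Because `map` preserves order, the chunks reassemble correctly no matter which host finishes first.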
Multi-Source Downloads
Q: What is multi-source downloading?
A: A particular file may be available on more than one remote computer. With multi-source downloading these various sources are grouped together, and if one source fails for some reason then another host can take its place.
Importance of Hashing and CPU consumption
BearShare hashes all your existing files when launched. This is a one-time activity and should not consume more than 25% CPU utilization.
Q: What is hashing?
A: Hashing is a calculation done on each file to produce a small, unique "hash". BearShare compares hashes to determine if two files are identical. It is important to do this sort of comparison to guarantee that the files being compared are the same, especially when swarming.
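The idea can be sketched with a standard cryptographic digest; SHA-1 is assumed here (BearShare's exact choice is not documented above, though Gnutella clients of this era converged on SHA-1). Chunked reading keeps the memory and CPU cost of the one-time pass low:

```python
import hashlib

def file_hash(path, chunk_size=64 * 1024):
    """SHA-1 digest of a file, read in chunks so large files stay cheap."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def same_file(path_a, path_b):
    """Two files are treated as interchangeable sources if their hashes match."""
    return file_hash(path_a) == file_hash(path_b)
```

Matching on the hash rather than the filename is what makes it safe to pull different portions of "the same" file from strangers when swarming.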
Superpeering and the erosion of pure peer-to-peer
Early 2001
Limewire
a new Gnutella hierarchy, whereby high-performance machines become 'Ultrapeers'. These machines accept connections from many LimeWire clients while also connecting to the rest of the Gnutella network. Moreover, the Ultrapeer shields these 'regular' LimeWire clients from the CPU and bandwidth requirements associated with Gnutella, directing traffic to clients in an efficient manner.
Any KaZaA Media Desktop can become a SuperNode if it runs on a modern computer accessing the Internet with a broadband connection. Being a SuperNode does not affect your performance noticeably. Other KaZaA users in your neighbourhood, using the same Internet Service Provider or located in the same region as you, will automatically upload to you a small list of the files they are sharing. When they search, they will send the search request to you as a SuperNode. The actual download takes place directly between the computer sharing the file and the person downloading it, peer-to-peer.
Retrieval
Connections
Every connection costs bandwidth of approximately 0.5k per second.
Smart Downloading
Smart downloading will retry a given download until it is successful. In other words, if you have tried to retrieve a file from a similar group of files, then LimeWire will try to download from any of these sources until it is successful. It will also auto-resume if interrupted.
Search Considerations
Search and response take place over the same route, which is why methodology is so important. The file is then transferred directly using an HTTP interface.
Exclusion
If there are particular IP addresses you wish to ignore (if, for example, a particular IP address was sending you unsolicited results), click under Hosts, where you can enter that IP address into the 'Ignore these hosts' window and click Add.
Q: Is there any means for these networks to allow prioritization of files stored on the same network, or on a network nearby, so as to minimize the need to travel over the backbone, through multiple peering interfaces etc.?
AG: The Satellite automatically selects the closest user with the file you want, reducing external bandwidth usage.
Kazaa: automatically clusters traffic by network topology to provide the fastest download speed and minimal load on ISP backbones.
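Both answers describe the same heuristic: prefer the source with the lowest network distance. A toy selection function (the candidate hosts and latencies are invented; real clients infer proximity from probes or topology rather than a ready-made list):

```python
def closest_source(sources):
    """Pick the candidate host with the lowest measured round-trip time (ms)."""
    return min(sources, key=lambda s: s[1])[0]

# (host, round-trip ms) pairs as a latency probe might report them.
candidates = [("peer-remote", 180), ("peer-same-isp", 12), ("peer-nearby", 35)]
print(closest_source(candidates))  # → peer-same-isp
```

Choosing the same-ISP peer keeps the transfer off the backbone and the peering interfaces, which is precisely the load-minimisation both services claim.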
Sharing Incentives
Default sharing: Limewire: automatic sharing of downloaded files. Limewire also allows you to require a minimum number of shared files before allowing a download to a given user.
AG: You must first share at least 25 files to be able to increase the number of simultaneous transfers you can have.
distribution - load balancing
existing solutions
dedicated servers
web caching
expanding hard disk memory size
necessary preconditions
flat pricing?
broadband figures [see TCC fn 9]
- take up
- proportion of capacity utilised
dial up connections
caching
server farms
mirrors
economic redundancy in uncalibrated approach
cost of server collocation
cost of memory
speed of memory capacity growth
eDonkey is a client-server-based file-sharing network. It also means that the moment you start a download, you may already be uploading the very same file to someone else. This also fixes the freeloader problem since, even if you don't share any files, your bandwidth is always put to good use. You only tell the donkey how much bandwidth you want to use for uploads and for downloads.
www.sharereactor.com
cosmoed2k.da.ru
"The VCR is to the American film producer and the American public as the Boston Strangler is to the woman alone."
- Jack Valenti, MPAA
------------------
*****************
Three commercial online music suppliers
Pressplay
Musicnet
Full Audio/Clear Channel
MusicMatch, MusicNet and FullAudio don't permit burning; Pressplay, Emusic and Rhapsody allow burning.
http://www.neo-modus.com/?page=News
http://www.neo-modus.com/?page=Help
http://www.climate-dynamics.rl.ac.uk/
FastTrack is the file-trading software being used by Consumer Empowerment, which licenses its technology to Kazaa.com, Grokster and MusicCity.
Fraud. Several prolific warez kiddies figured out how to change their MAC address to bill their service to their neighbors or even to our own router (!). We're still not sure exactly how that happened. Sure, we cut them off and connected their modems to a high voltage source as punishment (our contract allowed it), but how many more are there who we didn't catch?
Billing issues. People who obviously ran up a very high bandwidth bill would call us and complain when they got their statements, asking us to lower their bills. Our position was that it wasn't our responsibility that they couldn't figure out how to close Napster or stop downloading porn. When they paid with credit card we would sometimes lose the dispute, but things were okay when they paid with cash or check.
Expectation of quality. As you know, a cable modem is a shared medium, and cable companies are not at fault for your neighbors' downloading habits. However, it was considered a potential legal liability to be providing a service of varying quality.
                               Modem, 56 Kbps   Cable, 512 Kbps    T1, 2 Mbps
Picture, 200 Kb                40 seconds       2 seconds          2 seconds
Music track, 4 Mb              13 min 30 sec    1 minute           15 seconds
Full-length movie, 400 Mb      22 hours         1 hour 45 minutes  25 minutes
Five-minute video clip, 20 Mb  1 hour           6 minutes          2 minutes
200-page novel, 1 Mb           4 minutes        15 seconds         4 seconds
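Figures like these follow from size x 8 / line rate. A quick calculator, reading the table's file sizes as megabytes and line rates as kilobits per second (an assumption; the printed figures also include protocol and contention overhead, which is why the modem column runs higher than the ideal time):

```python
def transfer_seconds(size_mb, rate_kbps):
    """Ideal transfer time: size in megabytes, line rate in kilobits/second."""
    return size_mb * 8 * 1000 / rate_kbps

# 400 MB movie over 512 Kbps cable: 6250 s, i.e. roughly 1 hour 45 minutes.
print(round(transfer_seconds(400, 512)))  # → 6250
```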
For example, a T3/DS3 connection
has a capacity of 45 Mbps, while a stream with 30 fps, at 320x240 pixels
can have a rate of 1 Mbps. Under such conditions, only 45 clients can be
provided a maximum resolution video stream.
A Haclab in the Zapatista Rebel Secondary School
El Primero de Enero "Escuela Secondaria Rebelde Autonomo Zapatista" is the first autonomous secondary school in Chiapas. It is located in the educational and cultural centre of the Zapatista movement in Oventic Aguascalientes II. There are about 150 students and 25 teachers. In the cultural centre there is neither telephone nor internet. But there are some computers in the school, and from there we're going to start.
Aim of the Project
In the first place we're going to get to know a community: to learn their way of constituting spaces for life, and for social and cultural autonomy. Then we're going to assist them, in so far as we can. We will install Linux, as they themselves have requested, on the PCs they already have in their laboratory, and we will teach them to install and use it, so that they can then teach others.
To be autonomous is also to be independent of the monopolies on knowledge. We will assemble a local network and install some basic servers (web server, mail server etc.), all the while explaining to them what we're doing and how it is done.
We'll teach them how to make a website. Each phase will be conducted using the workshop method, or if you prefer, learning while doing, paraphrasing the Zapatista slogan "asking while walking".
A Project in several steps
The project will not finish with this first trip. We will work so that we can carry on the collaboration, setting other objectives.
With each voyage we will build a new tool, from time to time singling out what is most useful, together with the community.
For sure one of the objectives is a connection to the internet.
We will try to establish collaborations with other groups and organisations that are working on solidarity projects with the community in Chiapas, to manage to connect them to the net. In this first phase we can think of a portable machine configured at least as a mail server. In this way, once a day they could go to the closest city (San Cristobal is about an hour's drive from Oventic) connect the notebook to the network, and thus send and receive from and for the School.
The Meeting between Communities
We hope that this will be just the first moment of meeting between the italian community that is drawn together around the organisation of the annual hackmeeting [http://www.hackmeeting.org] and the community of Aguascalientes in Chiapas.
Let's be quite clear: this project ought not be closed to others' participation, but it seems to us that the attitude of the Italian hackers towards building tools and sharing knowledge "without founding powers" goes very well with the Zapatista attitude of struggling for one's own autonomy and for the right to choose one's own destiny.
It is also a working proposal that we address to all those who are engaged in activities related to the conscious use of technology; to whoever sees in the use of digital instruments a possibility to get away from the commercial logic of the technology and media multinationals. In this sense we will also try to involve Indymedia Chiapas in this process, which does great work in San Cristobal.
How you can help us
We need support at various levels: hardware [the computers in the labs are not in good condition], money to buy some equipment on the spot and to guarantee the maintenance of the lab, and ideas and advice, which are always most appreciated.
Hardware
UTP Cat 5 Cable
PCI and /or ISA network cards
10/100 Hub
Working Hard Disks
Ram
Video Cards
CD-ROM readers
Portables
Scanner
Printer
Money
You can contribute your cash by payment to
C.C.P. 61804001 made out to
Radio Onda Rossa,
Via dei Volsci 56
00185 Roma,
specifying as description of payment
"Hacklab Project in Oventic"
Ideas and Advice
ezr@autistici
If there is a single reader of this turgid notebook who still actually lives there. Otherwise the details recounted within are only fuel for the exiles' schadenfreude.
Irish Internet Users group
the spirit of the time, growing slowly and quietly ripe for the new form it is to assume, disintegrates one fragment after another of the structure of its previous world. That it is tottering to its fall is indicated only by symptoms here and there. Frivolity and again ennui, which are spreading in the established order of things, the undefined foreboding of something unknown-all these betoken that there is something else approaching. This gradual crumbling to pieces, which did not alter the general look and aspect of the whole, is interrupted by the sunrise, which, in a flash and at a single stroke, brings to view the form and structure of the new world.
Hegel, Preface to the Phenomenology of Mind
Some thoughts on media criticism.
The Decline and Fall of Public Broadcasting, David Barsamian.
Rich Media, Poor Democracy, Communications Politics in Dubious Times, Robert W. McChesney
As a stalwart veteran of alternative media production through his radio work based out of Boulder, Colorado, and his accessible interviews with Said and Chomsky published in print by South End Press, one would expect Barsamian to have an insightful critique of the contemporary media landscape. But despite its many qualities, this book proved something of a disappointment. Essentially the text falls victim to a series of flaws, several of which have come to characterize leftist attempts at media critique.
The first of these is a simplistic revising of the history of the FCC, which would lead one to believe that its purpose, originally honorable, has now been perverted. Thus Barsamian writes:
"The New Deal-era politicians who created the FCC in 1934 viewed the communications infrastructure as a vital public resource, like utilities or the railroads. The agency was originally charged with protecting consumers from industry monopolies and price gouging. But a September-October 2001 Mother Jones article, "Losing Signal" by Brendan L. Koerner, exposes an emergent corporate-friendly FCC, as embodied by Powell."
First off, this is a strange way to describe an agency established by the 1934 Communications Act as originally having had a consumer protection design. In fact, the 1934 law was the final act in a government operation undertaken in collaboration with commercial broadcasters to marginalise non-commercial broadcasters and create the circumstances which would allow an advertising-driven conception of programming to develop a stranglehold over the US audience. This process had begun in 1927 with the Radio Act that created the Federal Radio Commission, an institution that rendered non-commercial broadcasters unsustainable through the imposition of costly technical requirements. On this point see McChesney, telecommunication... and democracy. The squashing of FM radio in the 1940s and of cable in the 1960s provides further examples: historically, the FCC has been the vehicle through which incumbent players have held at bay innovative technologies threatening their position. The inventor of FM was so dispirited by this that he threw himself out the window of his apartment in 1954. The most recent instantiation of this pattern is the thwarting of Low Power radio by commercial broadcast interests in combination with the doyen of public broadcasting, NPR, which was among the final acts of the Clinton administration in December 2000 (see Hazlett p.46-90). On the movement for and potential of microradio, see Microradio & Democracy: (Low) Power to the People, Greg Ruggiero, Seven Stories Press, 1999.
A sad reflection on the poverty of leftist critique is that a critical history of the FCC is produced with far more accuracy and panache by the libertarian right, as exemplified by the Brookings Institution's Tom Hazlett (himself a former FCC economist) in his paper 'The Wireless Craze, The Unlimited Bandwidth Myth, The Spectrum Auction Faux Pas and the Punchline to Ronald Coase's "Big Joke": an essay on airwave allocation policy'. Obviously Hazlett has other aims, namely the promotion of a policy built on maximal property rights and market mechanisms, yet perversely he succeeds also in producing something useful to left observers. Hazlett provides a sobering narrative of the origins of radio regulation (see especially his reconstruction of the period from 1926-27, which saw the 'breakdown of the law', at p.97-100) and a skeptical outline of the protean content of both the 'public interest' and the 'fairness doctrine', two foundational legal doctrines upon which leftists would build their 'progressivist' new Jerusalem, not knowing that this is a base of sand rather than stone. Notably, Hazlett has no compunction in using technical data supplied by Rappaport et al. of the media foundation, and essentially lifts important sections of his historical analysis from McChesney. His section on LPFM in particular could easily be inserted in the contents of a Z magazine without drawing controversial comment.
Of course there are moments which might attract the more attentive reader's suspicions. Political economy and the corruption of institutions are not something to which Hazlett is blind; thus he argues for market processes as a means of diminishing their influence. For Hazlett, markets are indeed synonymous with the only type of 'democracy' he conceives as practicable.
Secondly, in order that the value of spectrum be maximized, he advocates maximal property rights. For leftists of course this is high blasphemy: the 'public' are the 'owners' of the spectrum, and thus actually selling it would constitute the abandonment of the public trust and a handover of the media to corporate interests on a massive scale. The hard truth is that leftists are so demoralized that the only means they see to advance their political objectives is through the ideological instrumentalising of the state to impose order on capital by authoritarian means. Yet their public ownership of the airwaves dream-sequence is just that, a daydream based on hopelessness. Broadcasters understand their relationship with politicians very well, and vice versa: they are each other's very oxygen. Broadcasters provide politicians with publicity, frame issues so as to exclude the merest possibility of popular self-government and, more specifically, supply the cash to finance their political campaigns. In return, politicians grant them all the laws they desire to copperfasten their control over the public's attention, through extension of intellectual property laws and spectrum giveaways (such as the estimated 60 billion dollars of spectrum given to television network incumbents under the 1996 Telecommunications Act). Thus, on a fair reading of the evidence, Hazlett's claim stands, even if one differs with his concept of preferential outcomes as produced by a property regime. The wisdom of exclusive rights now forms the crux of the polemic between those with the stomach to involve themselves in the opaque world of spectrum licensing; influential critiques have been made by Noam (on the basis of price efficiency), Lessig (on the basis of abundance) and Benkler (on the basis of abundance and the potential for a democratic decentralised model).
To counter Hazlett at root requires a serious examination of two aspects of markets that his utopianism neglects to scrutinize. The first is the question of externalities, that is, the positive and negative byproducts of a process whose costs and benefits are not taken into account in the price-setting mechanism. In the area of information, a positive externality is considered to be its effect of informing social actors and equipping them with the knowledge to make informed decisions about their lives. The same information may also help to encourage technical innovation, or can have a negative effect through phenomena such as desensitization to violence. In the area of information goods, one form of externality that has received considerable attention in recent years is that of network effects: contexts where each additional user of a specific standard of equipment or software platform, for example, adds value to the other users' experience or capability. This issue lies at the heart of the controversy over Microsoft's antics to maintain and extend its dominance.
The other critique, which goes to the heart of market optimists' claims, is the fact that most markets do not work along a simple supply/demand matrix, principally because of inequalities in bargaining power between market actors, and the ability of deep-pocketed organizations to manufacture demand through marketing etc. This is the context in which a reading of Manuel DeLanda's 'Markets and Anti-Markets' is very useful, turning as it does the market critique in on itself.
Spectrum allocation remains basically void of social critics, with the exception of critical legal commentators and Robert McChesney, who for reasons elaborated later spoils his performance by ignoring the impact of new technologies and the new centrality of intellectual property law.
The abiding legacy of the 1990s is a paradox, which is that at the very time when media ownership has scaled to what would once have been thought unthinkable levels, the variety of conduits for the dissemination of mass media has diversified enormously. Decentralized protocols such as TCP/IP, which lie at the heart of the Internet, have also enabled the emergence of high-speed wireless networks. The telephone infrastructure is now ripe for the delivery of television programming to the home. Cable lines have been basically opened up for consumers to become producers of information themselves. Satellite systems are trying to get in on the action. At the root of all these things lies packet switching technology and digitalization. While dystopian curmudgeons take glee in the meltdown of the dotcom boom, they are missing the essence of the change, refracted in their eyes through the yuppie culture that was the period's concubine. Yet radical groups have taken advantage of these tools, and to Barsamian's credit, although he does not appear too closely informed on the matter, he does dedicate a chapter to Independent Media Alternatives, focusing on Z Magazine's transition to a digital portal, the Indymedia network and Democracy Now. These examples are a pretty reasonable place to start the discussion of what has happened.
Treating the Indymedia network as a whole is a misleading exercise due to the pronounced decentralization behind it. Beyond the trademark layout of the interface and the use of logos of a somewhat similar style, at one point it could be considered of a piece, as each node, despite its editorial independence, functioned on the same software, Active X. This code was developed in advance of the WTO meeting in Seattle in 1999, and was designed to allow for maximum popular participation in coverage of the protests. To minimize the time between observation and reportage, and to give effect to principles of DIY media and free speech, the defining characteristic of the interface and upload mechanism was to allow for unfiltered open publishing. Nowadays this identity is no more, as at least five different pieces of software (Active, Active X, slashcode, python and java) are used by different groups.
In addition, the notion of an unfiltered newswire has been jettisoned by many nodes, particularly those in Europe where liberal notions of free speech have been overwhelmed by leftist canons such as anti-fascism; thus all fascist posts are removed on the German and Italian sites, and given the history of those two countries this is hardly surprising although not uncontroversial.
This diversity in infrastructure has positive and negative elements. The advantage is that multiple codebases are in development at any given time, addressing problems in different ways; the correlative disadvantage is that the attentions of politicized coders are divided and the threat of platform incompatibility is ever present.
McChesney’s book has several merits, one amongst them being this corrective lesson in the history of the FCC. Elsewhere, one of his key concerns is a dissection of the claim that the advent of the Internet will lead to a democratization of economic life and of the media in particular. The concession of control to commercial forces and forms of private governance is fingered as the main reason why it is unlikely for ‘the Internet to set us free.’ This section of the book combines a healthy skepticism of the ‘democratic’ nature of the free market with a cynical eye on its more populist exponents, a little à la Thomas Frank and the Baffler. More importantly perhaps, he highlights the key role played by the state in the development of the technology and the regulatory influence exerted over markets in the form of monopoly grants, such as in the case of cable systems. This point is even more relevant to the discussion about intellectual property, and it is an observation made elsewhere, and with great insight, by James Boyle. Yet there is an infuriating lack of substantive explanation; thus, when he claims that:
“Business interests are coming to play the dominant role in establishing web technical standards, which are crucial to expediting electronic commerce.”
Now this is obviously true, and there is ample documentary evidence on the point, but the problem is that the evidence is absent from McChesney’s text, and one needs to look elsewhere, such as to the work of Larry Lessig, to find a useful articulation of the point.
Another aspect of the picture neglected by McChesney is the real tension that exists between different players in the online environment. According to his theory, modern capitalism is defined by the attempts of large conglomerates to acquire an oligopolistic position in the market and to milk the available rent to the maximum degree through means other than price competition. Thus the author traces a seemingly unending taxonomy of strategic partnerships, joint ventures and cross-investment between the major industry players in recent years, agents that according to the market prescription ought to be practically cutting one another’s throats with competitive enthusiasm. The problem is that this ignores some of the technological changes that really have disintegrated some of the walls between markets. For example, wireless networking through components built into hardware can erode the power of the owners of the telecommunications infrastructure. Thus Apple builds AirPort into its systems and Nokia integrates networking protocols into its wireless devices. While this example exists at the level of the delivery conduit, similar processes have been at work over the years in content – think about the fact that the telecommunications companies make money when you download ‘free’ MP3s. Likewise Sony litigated for nearly seven years to fend off the motion picture studios’ attempts to outlaw the video recorder, in a case that foreshadowed many of today’s conflicts between consumer electronics companies and media proprietors. Of course Sony is now in the ‘content’ business too, but that doesn’t stop Apple, Diamond etc. from building machines designed to capitalize on mass practices of copyright infringement. McChesney perceives many of these companies as representing the same agenda, ‘the commercialization of the Internet’, when there are in fact important divergences in their interests, divergences which create space in which other processes can develop.
The now classic example of this is the music industry and MP3s: because the technology had enough time to develop, there is now a legacy of millions of MP3s that simply cannot be put back in the bottle, irrespective of what happens to the next generation of devices, many of which will probably integrate copyright management systems designed to thwart ‘piracy’.
The other force deflating the liberatory potential of the network, in McChesney’s eyes, is the centrality of the e-commerce portals, as defined principally by AOL and, to a lesser degree, by the likes of MSN and Yahoo.
“… or an inside-the-beltway public interest lobbyist so used to being ignored that just seeing the words ‘public interest’ in a government report was a life-defining event.” (158)
The greatest blind spot in McChesney’s analysis, however, concerns his silence on the issue of intellectual property. He devotes a section of his Internet chapter to examining the role played by the traditional media manufacturers in determining the contours of the new landscape (their advertising forecasts, their partnerships for the distribution of music, their ownership of high-profile brands etc.) without so much as mentioning the important evolution taking place in file-sharing technology, which is revolutionizing media distribution. What began as a basically centralized model vulnerable to legal attack (Napster) has evolved through at least two further generations. The Gnutella network (Bearshare/Limewire) represents the first: a decentralized network in which each peer acts as both client and server. This makes for a much more robust network, in the sense that connectivity is not dependent on the legal health of a single operator. The trade-off is inefficiency in locating files, and the problem of free-riding users, who actually impede the functionality of the system beyond simply failing to contribute material. Limewire addresses this problem to some degree by providing the option to refuse uploads to users who do not share a threshold number of files. Unfortunately this cannot attenuate the problem of inefficient searches per se; it merely offers a disciplinary instrument to force users to contribute. In order to sharpen search capacity in the context of a problematic network design, these networks have taken recourse to nominating certain nodes as super-peers, by virtue of the large number of files they themselves serve. While essentially efficacious, the consequence is to undermine the legal robustness of the network. The threat is made clear in a paper published last year by researchers at Xerox PARC, which analyzed traffic patterns over the Gnutella network and found that one per cent of nodes were supplying over ninety per cent of the files.
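The flooded query and the freeloader threshold described above can be sketched in a few lines of Python. This is a toy illustration of the general technique, not Gnutella's actual wire protocol: the class names, the default TTL and the threshold mechanism are all illustrative assumptions.

```python
from collections import deque

class Node:
    """Toy Gnutella-style peer: each node acts as both client and server."""
    def __init__(self, name, files, min_shared=0):
        self.name = name
        self.files = set(files)       # files this peer shares
        self.neighbors = []           # direct connections to other peers
        self.min_shared = min_shared  # Limewire-style freeloader threshold

    def connect(self, other):
        self.neighbors.append(other)
        other.neighbors.append(self)

    def allows_upload_to(self, requester):
        """Refuse uploads to peers sharing fewer than min_shared files."""
        return len(requester.files) >= self.min_shared

def flood_query(origin, filename, ttl=4):
    """Breadth-first flood: forward the query to every neighbor,
    decrementing the TTL at each hop; stop when it reaches zero."""
    hits, seen = [], {origin}
    frontier = deque([(origin, ttl)])
    while frontier:
        node, t = frontier.popleft()
        if node is not origin and filename in node.files:
            hits.append(node)
        if t == 0:
            continue
        for n in node.neighbors:
            if n not in seen:
                seen.add(n)
                frontier.append((n, t - 1))
    return hits
```

Because every query visits every reachable node within the TTL horizon, search cost grows with the size of the network; this is exactly the inefficiency, and the motivation for super-peers, that the paragraph above describes.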
These users are vulnerable to criminal prosecution under the No Electronic Theft Act and the Digital Millennium Copyright Act. The music industry has been reluctant to invoke this form of action thus far, principally because of its confidence that the scaling problems of the Gnutella community reduce the potential commercial harm it can inflict. As super-peering and the like become more effective, this may change.
Another interesting attribute of the Limewire system is the option it provides to set up virtual private networks, so that users can establish perimetered communities based upon their own social affinities. This is the nightmare of the IP police.
Third-generation file-sharing systems begin with the Freenet architecture outlined by Ian Clarke in 1999. Although the Freenet network has not achieved anything like the same scale of adoption as other systems, its design characteristics set the standard, which has been emulated by others, specifically those built on top of the ‘fast track’ system. The crux of Freenet’s genius is its adoption of ‘small world’ organization. This refers to the experiment carried out by Milgram in the 1960s, in which 160 people throughout the United States were given letters to be delivered to stockbrokers and asked to pass them only through people they knew to get them to their final destination. Forty-two of the letters arrived, using an average of 5.5 intermediaries. The purpose was to illustrate the level of social interconnectivity, an experience with which most of us are familiar, as when one meets a stranger from a distant clime and discovers that you know someone in common. It is not that everyone has such an expansive social sphere, but rather that there are individuals whose circle of acquaintance cuts across a wide range of social groups. Freenet utilizes this principle by giving its software the ability to retain knowledge of the content available on other nodes; this information is retained between sessions. The result is an extremely effective search, storage and retrieval system. This feature has been emulated by systems such as AudioGalaxy and Kazaa.
A crucial point in all of this is that both Gnutella and Freenet are open source/free software, thus allowing non-commercially motivated individuals and groups to take up the baton as the main players progressively move towards a rapprochement with industry. Napster died attempting to placate its erstwhile enemies, whilst Kazaa will not allow downloads above 128 kilobytes per second in an attempt to appease the same industry, with whose representatives it is currently in negotiation for a license to move to a full commercial platform. Both are proprietary technologies, precisely so that they can exclude rivalrous non-compliant competitors. AudioGalaxy, however, is under the General Public License. AG deals with the ‘tragedy of the commons’ in a more determined manner(!). Specifically, it only allows the user to transfer more than one file at a time if they are sharing a minimum of 25 files. Likewise, there is no option not to share: the only means of not sharing is to exit AG, which of course means that the user cannot download files either.
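AG's discipline amounts to a two-line policy. The sketch below takes the 25-file threshold from the description above; the function name and the representation of 'unlimited' are illustrative assumptions, not AG's actual implementation.

```python
MIN_SHARED = 25  # threshold described in the text

def concurrent_transfer_limit(shared_files):
    """One transfer at a time below the threshold; effectively
    unlimited above it (the 'unlimited' value is an assumption)."""
    return float("inf") if shared_files >= MIN_SHARED else 1
```

Note that, unlike Limewire's optional refusal, this rule is not a per-peer courtesy but a property of the client itself, which is why exiting the program is the only way to opt out of sharing.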
Similar systems are now being offered by these companies to commercial media distributors, such as Cloudcast (FastTrack) and Swarmcast, using technical devices to allow distributed downloads that automate transfer from other nodes when one user logs off. The intention here is clearly the development of software-based alternatives to the hardware offered by Akamai, the principal player in delivering accelerated downloads, used by CNN, Apple and ABC amongst others.
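The swarm idea, fetching chunks of a file from whichever peers remain online and failing over when one logs off, can be sketched as follows. The chunk size, the Peer class and the fetch interface are invented for illustration; they are not Cloudcast's or Swarmcast's actual APIs.

```python
CHUNK = 4  # bytes per chunk (illustrative)

class Peer:
    """A peer holding a copy of the file; may go offline at any time."""
    def __init__(self, blob, online=True):
        self.blob, self.online = blob, online

    def fetch(self, i):
        if not self.online:
            return None  # peer has logged off
        return self.blob[i * CHUNK:(i + 1) * CHUNK] or None

def swarm_download(chunks, peers):
    """Fetch each chunk from the first peer that can supply it,
    falling back to the next peer if one has disappeared."""
    result = {}
    for i in range(chunks):
        for peer in peers:
            data = peer.fetch(i)
            if data is not None:
                result[i] = data
                break
        else:
            raise IOError("chunk %d unavailable from all peers" % i)
    return b"".join(result[i] for i in range(chunks))
```

The design point is that no single peer's departure interrupts the transfer: responsibility for a chunk simply migrates to another copy, which is what makes the approach a plausible software substitute for dedicated mirror hardware.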
The point of all this is that there is a distribution system available now that allows the global distribution of critical media. This network is not globally inclusive and is predicated upon access to a telephone line, a computer and (preferably) a high-speed network connection, but other, more powerful economic forces are driving the permeation of all these technologies, so this is a problem which will be progressively mitigated. In any case, exclusion is a fact of all media, whether one considers literacy (print) or purchase capacity (television/satellite). Radio is probably the most fundamentally democratic medium in an ideal sense, since the cost of acquiring a receiver is relatively low, and the linguistic range of the content available is basically quite comprehensive.
An unfortunate fact, on the other hand, is that in the affluent west, where the technology is accessible, it has not been taken up. A cursory search over Gnutella and Kazaa for files tagged indymedia (for example) turned up a grand total of one result: an audio track from a Ralph Nader rally. On AudioGalaxy, one file was found: an audio recording of a lecture by Richard Stallman! This represents failure, a deep failure of ambition. But behind this there is another problem, namely that most producers of critical audio-visual programming continue to refuse to make their works available digitally, due to an outmoded belief in the copyright orthodoxy: they simply cannot conceive of any other way of receiving remuneration for their work, and thus they decide to limit dissemination to physical-world networks where they can leverage distribution in exchange for fixed-price payment. (insert note on italy indy and software which allows you to break up large files)
Equally neglected are the enormous resources devoted to fast-tracking legislation expanding and reinforcing copyright, trademark and patent monopoly rights. This appears anomalous when set against the attention given to the 1996 Telecommunications Act, and to the symbiotic nature of the relationship between politicians and corporate media generally in the book. In total, apart from a passing reference to the passage of the 1996 WIPO Copyright Treaty, only one paragraph is allotted to the issue (179). In the mid-term such a lacuna is even more striking, as in a very real way this legislation brings both state and private forces of control and enforcement into the domestic sphere of everyday America.
Despite these weaknesses Rich Media… is an important historical contribution, and especially noteworthy in this respect is the attention paid to the phony Gore Commission investigation into public interest standards for broadcasters, in the aftermath of their successful customization of the 1996 Telecommunications Act and the now infamous HDTV giveaway. The Gore Commission effectively functioned as a mechanism for laundering the ill-gotten gains under a rhetoric of public interest which mandated literally nothing, and, coming post factum to the FCC’s grant of the licenses to the broadcasters, could not have been anything but impotent, irrespective of the nature of its recommendations. In addition, despite a fundamental disagreement with his optimism about state-orchestrated reform, McChesney’s concluding chapter conveys a confidence in ordinary people, and a healthy disrespect for paternalistic liberals, in a genuinely populist form that is unfortunately rare. His commitment to democracy has substance beyond the promotion of ideological agendas.
15,000 words for $8.00 certainly suggests Barsamian and South End Press put a high price on what they do.
"In Porto Alegre, we smelt the tempest in the rooms of the forum, and in the street we breathed hope. A hope that clashes with the rhetoric of the debates and the official positions.
The black bloc doesn't love all this rhetoric. For example, a river of words was spent on the poor countries and poverty. The poverty they were talking about with such rhetoric is the poverty of others, the poverty which does not belong to themselves, but the poverty of the poor. Each one of them who spoke about poverty meant to speak of others' poverty; you could tell from a thousand miles that they thought themselves rich. To stress the reasons for others' poverty is to exalt your own wealth, speaking of it with the benign idiocy with which the rich give alms. This process of discourse is one of the reasons for poverty. Whoever hates poverty must begin by destroying their own poverty: that poverty of relations, of affections, of a stupid existence that no money can substitute for. This whole attachment to one's own position, one's own group, one's own career, one's own country, one's own power: is this not the most pure and radical form of poverty?"
*******
"Only starting from one's own misery can one destroy poverty."
******
.......
A very strange book. Eager to set out a defining sensibility for the black bloc, immersed in its poetry of acts. Determined and unfaltering in rejecting violence against people, enthusiastic and unapologetic in endorsing attacks against things, property and especially images. One had the feeling that breaking the spell had run its course, but the success of this book suggests otherwise. Elsewhere the author is keen to dismiss the controversy over violence (embracing this term over the more neutral and contestational 'property damage') with the simple claim that the unorganised nature of it all renders such theoretical polemic superfluous. The source of the dispute is ascribed to the political struggle within the movement, and to the fear of criminalisation. The former is dismissed as futile, as the bb does not seek to establish any form of hegemony and thus has little reason to enter into a discourse. The second is treated more resignedly; the danger of the emergence of organised armed groups is noted, and indeed the existence of the bb is posed as one of the means by which such an evolution may be pre-empted.
******
Other odd aspects are the light touch extended to the self-appointed leadership and the lack of criticism of the ideological components vying for superiority within the circuits of agitation.
*******
For a text that disclaims ideology, there is a strange dissonance with the ideology of which it is itself the vehicle. The clear repudiation of violence against individuals imputed to the bb flies in the face of any experience with the German Autonomen or the area of autonomy in France, not to mention the Basque country. Not that I have any problem with the expression of unabashed subjectivity, nor in several cases with the author's own view. The issue is the elevation of a personal perspective to the status of rule, and the failure to explain at least that these are contested claims. There is a name for this, and it is ideology.
********
Once more we encounter the unfortunate Seattle effect, that is, the premise that the WTO meeting in December 1999 constitutes year zero for anti-capitalism and is the appropriate platform from which to construct all anew. Materially, such scars are visible in the text: of the four ancillary accounts provided as a form of appendix, two are descriptions of Genoa by Americans, one is by another US group (Crimethinc) and one is French. Don't mistake this criticism for eurochauvinism, but understanding the specificity of locality and territory, which is key to understanding the nature and properties of any given BB, requires a certain amount of native nous. Other accounts by Italians would have been more illuminating. There is considerable irony in the intense repudiation of the trademarks proliferated by the mass media alongside the prostration before the mass media's rendering of the history and attributes of the bb itself (viz. Seattle). The absence of any Spanish, German, Nordic and especially (for me personally) Eastern European interpretations and voices is a deep weakness of the book, given their heavy numerical engagement and, particularly in the latter instance, the very different socio-economic context.
In New York we meet with minds and egos, private personalities in an exchange of egoboo, mutual flattering. Occasionally we hurl our bodies into one another, but the greatest intimacy there is the product of a social alienation yearning to be healed. Our political meetings are full of frayed, neurotic spasms, exchanges of gossip and factoids, peacocks spreading their wings through the description of projects. The apogee of social engineering is but friendship. No line of inquiry is ever exhausted, only gestured at with a flourish. To the loss of territoriality is added a loss of mental stability, a fragmentation and globalisation of the mind, held in thrall to the relentless storm of information which circulates. Loss of self-definition other than that of homo economicus, and an incessant parade of made-to-wear personality types.