Software which adds value to books

Discussion: Books in 2025: The Future of the Book World

Join LibraryThing to post.

This topic is currently marked as "dormant" (the last message is more than 90 days old). You can revive the topic by posting a reply.

Nov 18, 2010, 10:10 am

One thing I've barely seen mentioned in discussions about the future of books is the use of electronic books in study and research. In my field (Biblical Studies), electronic access to books has made an incredible difference. I think this five-minute speech at O'Reilly's "Tools of Change" conference, given by the CEO of the company whose software I use (they re-publish over 10,000 books electronically), does a great job of explaining why the Kindle isn't the end of e-books, and why e-books aren't the end of publishing.

Nov 18, 2010, 11:12 am

Yes, I think you're right. There's a sort of collective tunnel vision on LibraryThing that can seduce you into a worldview where "book" = "popular novel read for pleasure".

I suspect that a pretty large proportion of LT members, if they thought about it, would realise that they are already using what are in effect bigger, better, more versatile ebooks in their daily work. On the average ebook reader you struggle to do a simple text search or a "go to page no.". But when I open an article from a scientific journal online, I'd think it rather strange and primitive if I couldn't click directly on the references in the bibliography to call up the earlier work the authors are citing. Not all that long ago, doing that meant spending a few hours wandering round the stacks and queueing up for the photocopier, or ordering the articles from the British Library if it wasn't a journal we had in house.

In those days you'd track down the reference only to find a wooden block on the shelf with a note to tell you that "this volume is at the bindery" - nowadays you get a message "this article is not available under your current subscription package" - plus ça change...

Nov 18, 2010, 12:46 pm

It has certainly been startling to me, as someone returning to academia after 18 years away, how much the approach to finding research materials has changed in the intervening period. With the advent of databases full of scholarly articles, I can work from the comfort of my own home instead of fighting with my fellow students over the one copy of the journal the library holds. However, whilst the form of access has changed, the actual content hasn't.

What should change if the primary form of consumption for these texts is electronic is how the articles are written. At a simple level, authors should be thinking about the hyperlinks they embed in their text, but e-publishing provides a multitude of multi-media possibilities as well. Articles could be treated as wikis, worked on in the Cloud etc. There's a lot of possibility and we're only at the beginning of the journey.

Where I think e-books in general will make a difference is in the publishing "tail". The 80/20 rule says 20% of published texts will account for 80% of all purchases and library loans. The other 80% of texts probably includes virtually all works of serious scholarship, which at the very least will become print-on-demand, and more likely totally virtual.

It is also true that more books are being published now than at any time in history. There are issues such as copyright to consider in the transfer to digital, but nothing insurmountable. The real problem, as thorold points out, is making these things available. Academic journal databases are expensive to access and are among the first things college libraries look at cutting when budgets are tight.

Edited: Nov 19, 2010, 9:48 am

Articles could be treated as wikis, worked on in the Cloud etc.

Yes, but...! We still live in a world where the principles of scholarship, as well as some very important legal considerations, rely on the notion that there is a well-defined publication event(*), after which the text of an article is immutable, and before which its content can be considered not to be known. Things get very messy if you have to rely on the history file of a wiki being preserved for posterity as well as the text of the article itself.

ETA: (*) and one or more clearly identified authors, who take responsibility for the content

Nov 21, 2010, 4:41 pm

>1 markbarnes: Thanks for the link, extremely interesting.
The discussion there was about analyzing the existing published scholarly content for a particular domain and using that analysis to help end users get to relevant information. Undoubtedly a significant amount of work that in effect generated new content.
Who else is working on similar approaches and in which domains?
What is the risk of interpretation bias in this type of approach?

Nov 21, 2010, 5:52 pm

> Who else is working on similar approaches and in which domains?

I think the Biblical studies domain is fairly unusual in that this work is being pursued enthusiastically at every level, from post-doctoral research down to Mom and Pop. In other words, it can attract both site-wide institutional sales at many thousands of dollars and a home market of hundreds of thousands of hundred-dollar sales.

> What is the risk of interpretation bias in this type of approach?

It is a risk. At Logos, the risk is minimised by working with a wide variety of publishers and editors. For example, one database being produced is a syntax database, which charts the grammatical relationships of every word in the Greek New Testament (picture below). It's open to significant editorial interpretation. But now there are three such databases, produced by entirely independent teams working with fairly different methodologies. The product ships with all three databases, and the user can choose which he wants, and indeed compare the approaches. There are even five independently produced databases which parse every Greek word used in the New Testament (which is open to interpretation only about 1% of the time).

This hasn't always been the case, but as the product has matured, and the value of such databases has become more clearly seen, more and more have been added. Now, the vast majority of the databases included come in multiple versions.

Dec 1, 2010, 3:49 pm

The two Logos-led sessions at TOC 2010 were by far my favorites this year. They *get* it. I've been following along ever since.

My biggest fear, as we start to digitize everything and anything we can, is that no one is paying attention to "versioning". That is, as Biblical interpretations shift, and as legal definitions, histories, maps, etc. get edited, uploaded (thus overwriting the previous edition), and pushed back out to be indexed and searched, no one is caching the changes.

Pick any historically hot topic: marriage, AIDS, welfare, citizenship. If I Google the legal definition of a U.S. citizen, I'll get an answer. Probably the most recent (and hopefully correct) answer. But what if I wanted to see when that changed... how does my search's #1 return compare to the #1 return from 1960... or 1860...
Who keeps up with all of those changes, if everything is being moved to editable and re-flowable formats?

There is something to be said for preserving copies of information in non-editable/read-only versions. I'm hoping it's something that people who build software to help with research and access take into account: the comparative value of previous versions.
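For what it's worth, the "caching the changes" idea above can be sketched in a few lines of code: a toy archive that stores every version of a document keyed by its content hash, so that an earlier edition can still be retrieved as of a given date even after later edits. This is purely illustrative; all names and data here are made up, and a real preservation system would of course be far more involved.

```python
import hashlib
import time


class VersionArchive:
    """Toy illustration: keep every version of a document, keyed by
    content hash, so earlier editions survive later edits."""

    def __init__(self):
        self.blobs = {}    # content hash -> text of that version
        self.history = {}  # document id -> list of (timestamp, hash)

    def save(self, doc_id, text, timestamp=None):
        """Store a new version; never overwrite earlier ones."""
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        self.blobs[digest] = text
        self.history.setdefault(doc_id, []).append(
            (timestamp if timestamp is not None else time.time(), digest)
        )
        return digest

    def as_of(self, doc_id, when):
        """Return the version that was current at time `when`, if any."""
        candidates = [(t, h) for t, h in self.history.get(doc_id, [])
                      if t <= when]
        if not candidates:
            return None
        _, digest = max(candidates)  # latest version not newer than `when`
        return self.blobs[digest]


# Hypothetical example: three successive "editions" of a definition.
archive = VersionArchive()
archive.save("citizenship", "Definition as of 1860 ...", timestamp=1860)
archive.save("citizenship", "Definition as of 1960 ...", timestamp=1960)
archive.save("citizenship", "Definition as of 2010 ...", timestamp=2010)

print(archive.as_of("citizenship", 1900))  # the 1860 text, not the 2010 one
```

The point of hashing the content rather than numbering editions is that identical re-uploads dedupe automatically, while any change, however small, produces a new, permanently addressable version.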

Jan 27, 2011, 10:32 am

Nicholas Carr includes a chapter about ebooks and their potential future in his book, The Shallows. Anyone interested in a group read? His book pulls together all sorts of information to examine how digital technologies are affecting the near future of our culture.