Above are the slides from my July 20 presentation on crowdsourcing to the American Association of Law Libraries annual meeting. When I first suggested the title, I was sure the presentation would be a positive one, demonstrating the ways in which crowdsourcing and collaboration “are changing” legal research. I have long been a believer that crowdsourcing can help democratize legal research and enable free research sites to become more viable alternatives to paid sites.

But as I dug deeper into my research for the presentation, my long-held fears about crowdsourcing were increasingly confirmed. It just has not ever worked well within the legal profession. Over the years, site after site has attempted to make a go at crowdsourcing. But they almost always fail. Why is that?

I have a quote in one of my slides that may pretty well sum up the answer. It is from Apoorva Mehta, who is now a huge Silicon Valley success story as the founder of grocery-delivery service Instacart, but who, earlier in his career, attempted to start a legal networking and crowdsourcing site called Lawford (later called LegalReach). Asked later why Lawford failed, here is what he said:

I didn’t know anything about lawyers when we started. Turns out, they don’t like technology, and they don’t like to share things.

Anyone who is considering starting a crowdsourced law site should take Mehta’s quote, frame it, and hang it above their desk.

That said, perhaps I am ever the optimist, but I do believe there is hope. Three sites, in particular, stand out to me as potential success stories in the small world of crowdsourced legal research. I’ll get to those later in this post, but first let me recap some of the history as I presented it at AALL.

The Idea of Crowdsourcing

Thanks to Wikipedia, we all get the basic idea of crowdsourcing. A dictionary defines it this way:

The practice of obtaining needed services, ideas, or content by soliciting contributions from a large group of people and especially from the online community rather than from traditional employees or suppliers.

You could argue that a company such as West Publishing has been using crowdsourcing for decades. It relies on armies of lawyers to write the headnotes, assign the key numbers and create the editorial enhancements that add value to the raw cases it publishes.

Of course, West pays its editors, so that doesn’t count as crowdsourcing. But what if you could replicate those editorial enhancements through crowdsourcing? What if a free legal research site could post cases and other primary legal materials and ask its users to help enhance, explain and explicate those materials? That, my friends, is the aspirational ideal of crowdsourced legal research.

In the legal field, one of the earliest attempts at crowdsourcing was by Harvard (then Stanford) law professor Lawrence Lessig. Five years after his book Code: And Other Laws of Cyberspace came out in 1999, it needed an update. So Lessig posted it to a wiki and invited the Internet community at large to edit it by adding ideas, questions or new material. In January 2006, Lessig took the product of that wiki-edit and added his own edits to produce Code Version 2.0. Revision by crowdsourcing seemed to be a success.

But where Lessig appeared to succeed, many others have failed. The legal Web is haunted by the spirits of the many crowdsourced sites that have come and gone. Among them:

  • Spindle Law. Launched in 2010 (see my write-up), it was to be “a new kind of legal research and writing system” that would make legal research “faster and smarter.” Part of the way it planned to do this was by using crowdsourcing to build on the knowledge of its users. All registered users could add or edit authorities, edit its tree-like hierarchy, comment on authorities, and vouch for or reject authorities.
  • Jurify. This site’s motto said it all: “No lawyer is smarter than all lawyers.” Launched in 2012 as the “first mass collaboration platform for lawyers and clients,” Jurify’s focus was on using crowdsourcing to enhance access to legal research. “Think of it as a Wikipedia for the law,” VentureBeat reported at the time. “By crowdsourcing the curation and information-gathering process, the startup plans to slash subscription fees for legal research.” Not long after it launched, it disappeared, only to re-emerge in January 2014 in a version that scaled way back on the original but stuck with the crowdsourcing idea. It did not survive the year.
  • Standardforms.org. This was founded in 2011 (see my write-up) by an attorney who was a VP and legal counsel at Brown Brothers Harriman. His idea was to use crowdsourcing to come up with better legal forms. Anyone could post a form and anyone else could revise or comment on it. His hope was to use crowdsourcing to achieve a consensus of what should and should not be in legal agreements. It never took off.
  • Lawford. This was the site launched by the aforementioned Mehta in 2011. On my blog at the time, I said this about it: “Lawford’s developers have the ambitious goal of building the largest legal networking platform in the world. In fact, they say that they hope someday to have every lawyer in the world become a contributing part of the site.” They planned to partly populate the site with court opinions and legal articles and then have its users build discussions around them.

Besides these research sites, there are the corpses of many legal networking sites that died primarily for lack of one essential ingredient: participation by lawyers.

Suspended Animation

Besides the sites that died, there are also a number of crowdsourced legal sites that remain online, but in what can only be described as states of suspended animation. There is, for example, the LawLibWik, a wiki for law librarians that was created in 2006 and last updated in 2007. Another is the Civil Law Dictionary, where the last update I could find was six years ago and where all the updates were made not by the crowd, but by a single person, the site’s founder.

A site I am particularly disappointed to see becoming dormant is Mootus. When it launched in 2013, it struck me as having the potential to gain some traction. I wrote about it both here and in the ABA Journal. It took a different approach to crowdsourcing, describing itself not as a legal research site, but as a platform for “open online legal argument” designed for both law students and practicing lawyers.

The idea was that a user would post a legal issue to be “argued.” Other users would respond by posting cases they believed were relevant to the issue, together with their arguments for why a case applied. Still other users could then comment on a posted case and vote on whether a case was “On Point” or “Off Base.”
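The mechanics described above amount to a small data model: an issue collects cited cases with arguments, and each citation accumulates community votes. A minimal sketch in Python, purely my own illustration of the workflow (the names and structure are assumptions, not Mootus’s actual implementation):

```python
from dataclasses import dataclass, field

@dataclass
class Citation:
    """A case posted in response to an issue, with community votes."""
    case_name: str
    argument: str
    on_point: int = 0   # "On Point" votes
    off_base: int = 0   # "Off Base" votes

    def vote(self, is_on_point: bool) -> None:
        if is_on_point:
            self.on_point += 1
        else:
            self.off_base += 1

@dataclass
class Issue:
    """A legal issue posted to be 'argued' by the crowd."""
    question: str
    citations: list = field(default_factory=list)

    def post_citation(self, case_name: str, argument: str) -> Citation:
        citation = Citation(case_name, argument)
        self.citations.append(citation)
        return citation

    def ranked(self) -> list:
        # Order cited cases by net community sentiment.
        return sorted(self.citations,
                      key=lambda c: c.on_point - c.off_base,
                      reverse=True)
```

In this sketch, a well-voted case floats to the top of `ranked()`, which is the whole value proposition: the crowd, not an editor, decides which authorities are most on point.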

But with activity on the site seeming to have dwindled, I asked its co-founder Adam Ziegler about what had happened. Here was his response:

Our early experiments asked:

  1. Would lawyers post research questions to the crowd?
  2. Would the crowd post answers in the form of legal citations?
  3. Would other lawyers find the public Q&A thread useful/helpful?

We found yes to the first, no to the second, yes to the third.

On the second, we found “no” even for unemployed new lawyers sitting on the couch – pretty clear refutation of our hypothesis that we would find early contributors among the community of underemployed lawyers.

My takeaway from those results was that explicit, unpaid crowdsourcing, Q&A-style, isn’t a viable model right now. I’d love to see someone prove that wrong.

Note his conclusion: Not a viable model right now.

Another dormant site that seemed promising is Law Genius, whose motto is “Legal analysis for the crowd, by the crowd.” It is part of the larger Genius network of crowdsourced community sites, all of which grew out of the original site, Rap Genius, which was started in 2009 for the purpose of listing and annotating rap lyrics.

Rap Genius was so successful that soon, users started using the site to annotate all sorts of other stuff, from the collected works of Shakespeare to the roster of the 1986 New York Mets to the warnings on the back of a Tylenol bottle. A year ago, the site officially relaunched as Genius, becoming a hub for a range of communities devoted to topics such as rock, literature, history, sports, screen and tech. All are united by the site’s overarching goal, “to annotate the world.”

All of these Genius sites seemed to thrive. And then came Law Genius last November. It was an effort to crowdsource statutes, case law and other legal news. When it launched, I interviewed its editor, a Yale Law School graduate. She said:

There’s so much information lawyers have (particularly in our own little fields of expertise) and we have so much to say about what’s happening, though we usually keep those thoughts to ourselves, either writing emails to listservs or blogging in our small interconnected blogospheres.

I thought, wouldn’t it be great if those conversations happened publicly, around the text of actual opinions and statutes themselves? And before you know it, I came here to kickstart Law Genius.

Fast forward and there has been virtually no activity on Law Genius. The latest update shown on the home page as of today is the same update that was there when I wrote about the site in November. As best as I can tell, the editor I interviewed left not long after I heard from her.

Is There Hope?

So is there hope for crowdsourcing in legal research?

There are a handful of legal sites that achieved some success in using crowdsourcing to build content. Starting in 2005, the Electronic Frontier Foundation used crowdsourcing to create its Internet Law Treatise. But it has not been updated by anyone for more than two years and not substantially updated for more than five years. JurisPedia has used crowdsourcing to create an encyclopedia of world law with more than 1,600 articles. But it has not been updated by anyone since 2012. I don’t know about you, but I like my legal research materials to be a little more timely than that.

One wiki-like site that always impressed me for being fairly thorough and up-to-date was Judgepedia, an encyclopedia of courts and judges. It no longer exists as a freestanding site, having been absorbed earlier this year into Ballotpedia, an encyclopedia about American politics and elections. But its content is still there and still timely.

Curious about its secret for crowdsourcing success, I contacted its editor-in-chief, Geoff Pallay, and asked him about it:

How do we keep it up-to-date? That is the billion-dollar question, isn’t it. Nearly all of our contributions are from staff writers — close to 99%.

So Ballotpedia’s “secret” is that it is not really crowdsourced. Like West, which I referenced earlier in this post, it pays its writers. And that’s not crowdsourcing.

Three That Could Be Contenders

Abandon all hope ye who read this post? Not so fast. There are three sites that appear to be achieving some success using crowdsourcing and that could provide models for others.

The first is Casetext, a site that I have written about frequently in recent years. It launched in 2013 as a free legal research site that would use crowdsourcing to annotate cases. Users would be able to add tags to cases to help organize them, add links to secondary sources that discuss the case, add annotations to the case, and reply to and comment on other users’ annotations.

That original concept was slow to get going, so Casetext looked for ways to get legal professionals more engaged in the site and in discussions around cases. Its first big step in this direction came last October, when it introduced new community pages. These are pages organized around practice areas and interests where lawyers can contribute analysis, meet others in their fields, and engage in discussions about current legal developments. These have become fairly popular, with some of its communities now having over 10,000 followers.

Then, in June, Casetext launched Legalpad, its publishing platform custom designed for people who write about the law. Users can use it to draft articles about the law. The articles get published to Casetext’s communities and also become part of the Casetext database of legal commentary. If an article discusses a case, the article links directly to the case and the case links back to the article.

It’s all about incentives, Casetext founder Jake Heller told me:

When we started out, we focused on annotations. The truth is, attorneys don’t write that way. The right people weren’t incentivized to write.

We created communities to give people real incentives to write on Casetext.

Now what we’re trying to do is make it easier. You don’t need a blog or WordPress. You don’t need to worry about SEO – we have 350,000 users every month. From day one, you can speak to a built-in audience.

The big-picture goal is to match the incentives and interests of people who are excited to write about the law … with what we think will be a really powerful research experience. … We want to make these discussions into data, to overlay the social layer with the primary source documents.

And then there is the oldie-but-goodie Wex, the crowdsourced legal dictionary started in 2005 by the Legal Information Institute. Although it has used a sort-of crowdsourced model since its start, earlier this year it began a more ambitious effort “to change fundamentally the nature of Wex into something that was more crowdsourced,” LII Associate Director Craig Newton told me.

As part of that effort, the LII — in a move similar to Casetext’s — built its own authorship platform within its content management system. The idea was to make it easy for its contributors to contribute and for its editors to edit. The LII recruited around 100 contributors to kick the tires on the new system and also developed a “how-to” guide.

“While the goal is ultimately true Wikipedia-style crowdsourcing,” Newton said, “we’re several steps away from that.”

Also similar to Casetext, the LII is focusing on how to incentivize contributors. One way they plan to do that is by highlighting contributors through author profiles. Another is to enable contributors to earn “karma points” and badges.

It remains my belief (founded more on instinct than data) that author attribution is a key piece. Not only is attribution a big incentive for folks to write for Wex, it is a good tool for the reader in order to evaluate the “trustworthiness” of what’s in the article.

Last but not least of my crowdsourcing success stories is CanLII Connects. A project of the Canadian Legal Information Institute, it was launched in April 2014 as a way to marry the case law CanLII houses with commentary from the legal community. To do this, it encourages lawyers, scholars and others who are competent in legal analysis to contribute commentary on cases or to post summaries of cases. (See my 2014 write-up.)

Only registered members are allowed to post and only after they have been approved by CanLII Connects staff. Once membership is granted, any member has the ability to add content, comment on the content of other members, and up-vote content. The site also allows entities to register as a “publisher” and post content. A publisher can be any law firm, organization, group, business or school that is also a member of the legal community. Publishers can post content to CanLII Connects directly and also authorize affiliated individuals (such as members of a firm) to post under the publisher’s name.

The site also draws content from blogs and other publications. It does not scrape content directly from other sites, but it encourages authors to republish their content on CanLII Connects. In this way, the author can link his or her content directly to the ruling it discusses and make it discoverable by someone who is researching that ruling.

Take one look at the site and you can see for yourself that the formula is working. Not only has it developed a good flow of content, but the content is thoughtful and timely. Colin Lachance, who in April left his position as CEO of CanLII, told me earlier this year:

We are at the beginning of a virtuous circle of growth: increased integration with our primary law site means greater awareness of the commentary site, including greater awareness of who is contributing; when lawyers and law profs see work from their peers on the platform, they are motivated to join and contribute their own work; expanding content from an expanding roster of respected professionals drives greater usage which make the platform more complete and more attractive to existing and future contributors to keep the flow of content going; continual growth will prompt us to pursue deeper integration of commentary and primary law which, hopefully, keeps the circle moving.

In other words, momentum is needed. As a site begins to build involvement, that involvement yields further involvement. There needs to be a critical mass of participation before any crowdsourced site can take off.

What Is The Secret Sauce?

So is there a secret sauce for crowdsourcing success in the legal field? Judging by these last three sites, I would sum it up as follows:

  1. Make it easy to contribute.
  2. Make it rewarding to contribute.
  3. Make the content useful to others.
  4. Success will breed success.

I’d love to hear your thoughts on crowdsourcing. I continue to believe that it could someday change the face of legal research. The big legal publishers do not have a monopoly on smart lawyers with good ideas and insights. If we could tap into the knowledge of the legal community at large, we could build a library of free legal research materials that could rival any paid ones.


As I start to make my way through some of the news I picked up at LegalTech last week, here’s a big one: The start-up legal research site Casetext announced that it has raised a $7 million Series A financing round. The round is led by Union Square Ventures and includes participation by, among others, Thomas H. Glocer, the Yale Law School grad who is the former CEO of Thomson Reuters and, before that, Reuters Group PLC.

This investment adds to the $1.8 million in seed funding the company secured in October 2013, bringing its total funding to $8.8 million.

I’ve written about Casetext for the ABA Journal and multiple times on this blog. The site provides free access to cases and statutes for legal research and uses crowdsourcing — insights contributed by the legal community — to annotate the legal materials in its collection.

Its CEO Jake Heller is a former litigation associate at Ropes & Gray and law clerk to 1st U.S. Circuit Court of Appeals Judge Michael Boudin. At Stanford Law School, Heller was president of the Stanford Law Review and a managing editor of the Stanford Law & Policy Review.

“We’re taking a totally different approach to legal research,” Heller said in a press release. “The old way of doing things misses the most valuable source of legal knowledge: the legal community itself. Lawyers already share insight about the law publicly to demonstrate thought leadership and grow their reputation. By building the best platform to write commentary on the law, we’re able to collaborate with the legal community to create an insightful, free legal resource for lawyers and the public.”

Last October, I wrote here about Casetext’s launch of “communities” along with several additional new features. Casetext’s community pages are designed to provide common ground for lawyers who share interests and practice areas.

If you are not familiar with Casetext, I encourage you to check it out. For more background on it, read through some of my prior posts.

 


After my post Monday about Law Genius, a crowdsourcing site for posting and annotating legal documents, someone pointed me to this Betabeat piece from 2012 that provides further details on the site’s origins as Rap Genius, its funding from venture capital firm Andreessen Horowitz, and its transition from a site for annotating rap lyrics to one for annotating virtually anything.

The headline of the piece is that Andreessen Horowitz invested $15 million in the Genius site. Why would a firm that has invested in Facebook, Pinterest and Foursquare invest its money here? Well, co-founder Ben Horowitz tells Betabeat he considers this “one of the most important things we’ve ever funded.” In a post on Genius, Horowitz and partner Marc Andreessen offer this:

It turns out that Rap Genius has a much bigger idea and a much broader mission than that. Which is: Generalize out to many other categories of text… annotate the world… be the knowledge about the knowledge… create the Internet Talmud.

The Betabeat article also has information of particular interest to the legal world. For one, one of the site’s founders, Mahbod Moghadam, is a Stanford Law School grad. In 2009, while on deferral from the law firm Dewey & LeBoeuf, he wrote a satirical memo about law firm billing that got him fired from the firm (and from an internship with Warren Buffett), freeing him to turn his attention to Rap Genius.

Moghadam has since been ousted from Genius, but it was he who saw the potential for bringing legal materials into the site, analogizing it to “footnotes on crack”:

“We are gonna do the dopest ads of all time,” Mr. Moghadam declared, but they have other potential revenue streams in mind as well. “Law firms will pay $100K a year for Law Genius Premium,” he insisted over email. “Lexis and Westlaw are jank—you go from one case to another, and it’s sloppy and wack. … Law Genius will be legal footnotes on crack! Also it can include anything—video, audio—instead of simply citing a Supreme Court case, you can embed the oral arguments!”

OK, I’ll admit I had to look up “jank” in the Urban Dictionary, but even I got the point. It will be interesting to see whether Law Genius’s brand of crowdsourcing legal materials takes off.


There is something very fitting in the fact that a site that started out deciphering rap lyrics is now turning its attention to making sense of the law.

The site, Law Genius, is the newest member of the larger Genius network of crowdsourced community sites, all of which grew out of the original site, Rap Genius, which was started in 2009 for the purpose of listing and annotating rap lyrics.

Soon, users started using the site to annotate all sorts of other stuff, from the collected works of Shakespeare to the roster of the 1986 New York Mets to the warnings on the back of a Tylenol bottle. Last July, the site officially relaunched as Genius, becoming a hub for a range of communities devoted to topics such as rock, literature, history, sports, screen and tech. All are united by the site’s overarching goal, “to annotate the world.”

Genius breaks down text with line-by-line annotations, added and edited by anyone in the world. It’s your interactive guide to human culture.

Now law is the latest addition to this ambitious effort at global annotation. It is an effort to crowdsource statutes, case law and other legal news. At the helm of the project, as executive editor of Law Genius, is Christine Clarke, a 2010 graduate of Yale Law School who practiced plaintiff-side employment law in Manhattan before joining Law Genius full time.

“There’s so much information lawyers have (particularly in our own little fields of expertise) and we have so much to say about what’s happening, though we usually keep those thoughts to ourselves, either writing emails to listservs or blogging in our small interconnected blogospheres,” Clarke said in an email. “I thought, wouldn’t it be great if those conversations happened publicly, around the text of actual opinions and statutes themselves? And before you know it, I came here to kickstart Law Genius.”

Any User Can Add Text

At Law Genius, any registered user can add text and annotate any text. Other users can vote up or down on annotations, or add their own suggestions to the annotations. As you view text, any portion that is highlighted has an annotation. Click on the highlighted text to view the annotation. To add your own annotation, just highlight a selection of text.

Any text on Law Genius can be embedded into any other web page, complete with annotations. Users can also share a document through social media or follow it to be notified of new annotations.
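Functionally, what the interaction above describes is an annotation anchored to a span of the underlying text and surfaced when a reader clicks within that span. A toy model of that idea, with votes included (all names are my own assumptions, not Genius’s actual code):

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """An annotation anchored to a character range of the source text."""
    start: int   # offset where the highlight begins
    end: int     # offset where the highlight ends (exclusive)
    body: str
    score: int = 0

    def vote(self, up: bool) -> None:
        self.score += 1 if up else -1

@dataclass
class Document:
    text: str
    annotations: list = field(default_factory=list)

    def annotate(self, start: int, end: int, body: str) -> Annotation:
        if not (0 <= start < end <= len(self.text)):
            raise ValueError("highlight must fall inside the text")
        annotation = Annotation(start, end, body)
        self.annotations.append(annotation)
        return annotation

    def annotations_at(self, offset: int) -> list:
        # What a click on highlighted text would surface: every
        # annotation whose span covers the clicked offset.
        return [a for a in self.annotations
                if a.start <= offset < a.end]
```

The anchoring-by-offset design is what lets any portion of a statute or opinion carry its own layered commentary, independent of the rest of the document.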

As examples of the types of annotations on Law Genius, Clarke pointed me to these:

There is also a section of Law Genius specifically for law students, where they can find classic cases such as Marbury v. Madison along with “Genius casebooks” on topics such as civil procedure.

Annotations on Law Genius are moderated by editors. Editors can accept, reject and edit annotations and help with training new users.

I’ve written about a number of sites for crowdsourcing the law, most recently CanLII Connects, Casetext and Mootus. All of these sites face the same challenge — building a sufficiently critical mass of users to fuel contributions and discussions.

“Any text can be as layered, as allusive and cryptic, as worthy of careful exegesis as rap lyrics,” says the about Genius page. If you want layered, allusive and cryptic, look no further than legal text. Perhaps Law Genius can help make sense of the law.

 


Crowdsourcing the law is a concept any number of legal sites have tried over the years, as I’ve written about many times. The idea behind it makes perfect sense. There are lots of very smart legal professionals out there in the world — practitioners, academics, librarians and even law students. If they can be encouraged to share their knowledge and insights, we would all benefit from their collective input.

One of the most recent examples of this for U.S. law is Casetext, a site that provides free access to court opinions and then uses crowdsourcing to add descriptions and annotations to the cases. I’ve written about Casetext a number of times here and also in the ABA Journal.

This week, at the suggestion of Jordan Furlong, I’ve been exploring CanLII Connects, a site that does something similar for Canadian law, drawing on the legal community at large — as well as on blogs and publications — to provide commentary on and analysis of Canadian court decisions.

CanLII Connects is a project of CanLII — the Canadian Legal Information Institute. Much like its U.S. counterpart, the Legal Information Institute at Cornell Law School, CanLII is an organization devoted to providing free access to law and legal information and to providing the legal community with a free, comprehensive and robust legal research service.

CanLII now houses more than 275 databases of caselaw and legislative materials from the federal government, the provinces and the territories. Its collection includes more than 1.3 million documents with new documents being added at the rate of 2,000 a week. CanLII is staffed by Colin Lachance, its president and CEO as well as a 2014 ABA Journal Legal Rebel, and Sarah Sutherland, its manager of content and partnerships.

Contributors Are Vetted

Last April, CanLII launched CanLII Connects as a way to marry the caselaw it houses with commentary from the legal community. To do this, it encourages lawyers, scholars and others who are competent in legal analysis to contribute commentary on cases or to post summaries of cases.

Only registered members are allowed to post and only after they have been approved by CanLII Connects staff. Here’s how they describe it:

CanLII Connects is committed to ensuring high-quality information for its users. For this reason, applications will be reviewed to determine whether you have the necessary competency to contribute summaries and commentary on Canadian case law. However, members are not necessarily required to be practicing lawyers or legal professionals.

Lachance, in an email, tells me that all membership applications are manually verified and participation is authorized only for people presumed capable of commenting on Canadian law. “Once membership is granted, any member has the ability to add content, comment on the content of other members, and upvote content,” he writes.

The site also allows entities to register as a “publisher” and post content. A publisher can be any law firm, organization, group, business or school that is also a member of the legal community. Publishers can post content to CanLII Connects directly and also authorize affiliated individuals (such as members of a firm) to post under the publisher’s name. A publisher can manage all content published under its name and by its affiliated members.
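The posting rules described here are, in effect, a small permission model: a publisher authorizes affiliates to post under its name and manages everything that appears under its banner. A rough sketch of those rules as I understand them (hypothetical names, not CanLII Connects’ actual code):

```python
class Publisher:
    """An organization that posts content and delegates posting rights."""

    def __init__(self, name: str):
        self.name = name
        self.affiliates: set = set()
        self.posts: list = []  # (author, title) pairs under this name

    def authorize(self, member: str) -> None:
        # e.g. a law firm authorizing one of its lawyers
        self.affiliates.add(member)

    def post(self, member: str, title: str) -> None:
        if member not in self.affiliates:
            raise PermissionError(
                f"{member} is not authorized to post under {self.name}")
        self.posts.append((member, title))

    def remove_post(self, title: str) -> None:
        # The publisher manages all content published under its name.
        self.posts = [p for p in self.posts if p[1] != title]
```

The point of the design is accountability: content posted under a firm’s name carries the firm’s reputation, so the firm both gates who may post and retains control over what stays up.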

Contributors can post either summaries or commentary. Here is how the site explains the difference between them:

A summary is a shortened version of the full court decision. It typically contains the key facts, reasoning of the case, and the court’s ruling. A summary should not include any commentary or opinion about the case; it’s our preference that contributors keep these documents as neutral as possible. Commentary, on the other hand, might include a summary of the case but will go generally further by offering an analysis or new idea related to the case or the legal topics at issue.

When an item is posted as commentary, other registered users can add their own comments to the commentary and create a discussion.

The site also aims to draw content from blogs and other publications. It does not scrape content directly from other sites, but it encourages authors to republish their content on CanLII Connects. In this way, the author can link his or her content directly to the ruling it discusses and make it discoverable by someone who is researching that ruling.

One example of how this works is from the blog ABlawg.ca, a blog on developments in Alberta law written by faculty members at the University of Calgary. When CanLII Connects launched, the authors of this blog joined as a publisher, contributed their back catalog of relevant posts, and continue to add their new posts to the site, typically within hours of posting them on their own site.

Another example of an academic blog contributing content to CanLII Connects is The Court, an Osgoode Hall Law School blog devoted to the Supreme Court of Canada.

Future Development

Lachance’s email to me includes a paragraph that perfectly describes the ideal of how crowdsourcing can work within the legal profession:

We are at the beginning of a virtuous circle of growth: increased integration with our primary law site means greater awareness of the commentary site, including greater awareness of who is contributing; when lawyers and law profs see work from their peers on the platform, they are motivated to join and contribute their own work; expanding content from an expanding roster of respected professionals drives greater usage which make the platform more complete and more attractive to existing and future contributors to keep the flow of content going; continual growth will prompt us to pursue deeper integration of commentary and primary law which, hopefully, keeps the circle moving.

Perhaps another way to put it is that success breeds success. Of the crowdsourcing sites I’ve covered over the years, several have failed. Their failure has stemmed from their inability to ever build a core collection of contributions. As Lachance suggests, once you can begin to build that core, then the core fuels further contributions, almost virally. From what I’ve seen of CanLII Connects, it looks like they’ve built a solid core and are well on their way to becoming a vital site for Canadian legal research.

Lachance would like to see the site develop beyond only case summaries and commentaries. “In building the legitimacy of the platform and process for sharing of short-form content, we build a community of potential contributors of longer-form content,” he says. That means that CanLII could become attractive to authors as a publisher of longer-form texts, manuals and treatises on a par with those you can find on Westlaw or LexisNexis.

That would take CanLII a long way towards its goal of providing the Canadian legal community with a comprehensive and robust research service.

Environmental Law is one of Casetext’s new communities.

It was one year ago that I first wrote here about Casetext, the free legal research site that uses “crowdsourcing” to annotate court opinions. More recently, I wrote about Casetext’s addition of a citator, called WeCite. Now, there is more Casetext news to report.

Casetext is preparing to launch a new version of its research platform that will add communities and other social features. The new version came out of private beta last week and is now in public beta at beta.casetext.com.

This is a summer of legal hackathons here in the Boston area. As I’ve previously mentioned here, the ABA Journal and Suffolk University Law School will be cosponsoring a hackathon around the theme of access to justice in conjunction with the ABA annual meeting in Boston. Hackcess to Justice, as it is being called, will be held Aug. 7-8, with $3,000 in prize money going to the top three hacks. (I’m thrilled to add that I will be one of the three judges for the event.) Read more about it here and here.

Meanwhile, the first-ever MIT Legal Hackathon kicks off tomorrow and runs through Sunday, and you do not need to be anywhere near Boston to participate, since the entire event takes place online. The event announcement describes its purpose:

The goal of the event is to bring together people to collaborate on solving legal and technical issues and challenges as law and business become fully digital. Software developers, business people, academics, government employees, advocates and others. Participants will have the opportunity to offer or join sessions to collaborate on “hacking the law” by developing computer and legal projects.

Included over the course of the event will be four “innovation challenges” that will focus on coming up with ways to better address certain issues relating to legal information:

  • Clio challenge. Clio will sponsor a challenge that will pull together participants from the business, legal and computer communities to look at a couple of different issues and develop approaches to solving the needs of all three communities.
  • Annotation challenge. Sponsored by Casetext (which I reviewed here), this challenge focuses on conceiving of and implementing ways to use and improve the Casetext platform, which uses crowdsourcing to annotate primary legal materials.
  • Effective legal language challenge. Andrew Perlman, director of the Institute on Law Practice Technology & Innovation at Suffolk Law School, and Gabe Teninbaum, professor of legal writing at Suffolk Law, will lead this challenge, which will look at developing a two-step crowdsourcing platform for the drafting of more effective legal language.
  • Controlling and exercising data rights. No description has been posted yet of this challenge.

In addition to these challenges, there is a full program of related sessions, as well as the opportunity for participants to create their own sessions. See the full program here and a list of the speakers here. Registration is free. Follow developments through the event blog.

WeCite entries for the Supreme Court case Granholm v. Heald.

I’ve written both here and for the ABA Journal about Casetext, a free legal research platform that uses crowdsourcing to add annotations and descriptions to cases. As a matter of fact, I’ll be talking about Casetext at ABA Techshow tomorrow as part of a presentation on using crowdsourcing in legal research.

Today, Casetext is officially announcing its version of a citator (think Shepard’s or KeyCite) to help users understand a case’s subsequent history. It is called WeCite, and it has been developed in conjunction with the Stanford Center for Legal Informatics, which will make the data gathered through the effort publicly available.

Now, when you view a case in Casetext, WeCite appears as an option in the left ribbon. Click it and you will then see a list of all judicial opinions that cite that case and brief descriptions of each opinion. Casetext automatically creates the list of opinions that cite to the case, but users can add their own citation references and analysis. And just as users can vote up or down on annotations in Casetext, they can vote up or down on WeCite entries, causing the entries to move up or down the list.
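As a rough sketch of that voting mechanic (the entry model, case names and scores below are all hypothetical, not Casetext’s actual implementation), ranking WeCite entries by net votes might look like this:

```python
from dataclasses import dataclass

@dataclass
class WeCiteEntry:
    """One crowd-contributed citation note (hypothetical model)."""
    citing_case: str
    note: str
    votes: int = 0  # net score: upvotes minus downvotes

def rank_entries(entries):
    """Return entries ordered by net votes, highest first, so
    upvoted entries move up the list and downvoted ones sink."""
    return sorted(entries, key=lambda e: e.votes, reverse=True)

entries = [
    WeCiteEntry("State v. A", "Distinguished on its facts", votes=2),
    WeCiteEntry("State v. B", "Followed", votes=5),
]
ranked = rank_entries(entries)
print([e.citing_case for e in ranked])  # → ['State v. B', 'State v. A']
```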

While WeCite is not comparable to Shepard’s or KeyCite in its thoroughness, capabilities or analysis, it is a useful addition to this free legal research site.

The heatmap runs down the right side of the case.

Another new feature in Casetext is the “heatmap” that runs alongside each case as you view it on your screen. The heatmap shows how frequently each page in a case has been cited — the darker the color red on the map, the more it has been cited.
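The mapping from citation counts to shades of red can be sketched in a few lines; this is my illustration of the idea, with invented counts, not Casetext’s actual code:

```python
def heat_shade(citation_count, max_count):
    """Map a page's citation count to a red intensity between
    0.0 (rarely cited, light) and 1.0 (most cited, darkest red)."""
    if max_count == 0:
        return 0.0
    return citation_count / max_count

page_citations = [1, 4, 8]  # hypothetical per-page citation counts
max_count = max(page_citations)
shades = [heat_shade(c, max_count) for c in page_citations]
print(shades)  # → [0.125, 0.5, 1.0]
```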

Jake Heller, Casetext CEO, says there are more developments in the works for his company, which was launched just last summer. So stay tuned here for updates.


In October 2012, two longtime corporate lawyers announced the private beta launch of Jurify, which they described as the “first mass collaboration platform for lawyers and clients.” The site would focus on using crowdsourcing to enhance access to legal research. “Think of it as a Wikipedia for the law,” VentureBeat reported at the time. “By crowdsourcing the curation and information-gathering process, the startup plans to slash subscription fees for legal research.”

It was not long, however, before Jurify disappeared. The site never came out of private beta. Instead, its founders went back to the drawing board, realizing that they had promised more than they could deliver. “We fell victim to one critical mistake of first-time entrepreneurs,” cofounder Erik Lopez told me this week. “Our aspirations exceeded what we could technically achieve with our first iteration.”

Now Jurify is back. It is relaunching this week — perhaps as soon as today — with a pared-back site that will be home to a free legal-research platform that Lopez describes as unprecedented for its sophistication and functionality.

To begin, Jurify will limit its coverage to legal research materials for transactional lawyers, providing resources for corporate, securities and M&A lawyers. If that proves successful, the company will begin to build out other practice areas, beginning with other complementary areas such as secured lending and executive compensation, with the goal of eventually covering every practice area.

“By growing vertical by vertical, we’re able to deliver a highly customized interface,” Lopez said. “This enables us to build a much smarter search functionality that sees what you’re looking for and knows how to deliver it to you.”

A Focus on Tags

I have not yet had a chance to use Jurify, but Lopez gave me a demonstration earlier this week. I was impressed with what I saw.

Jurify does not create legal research material; it curates it. Generally speaking, you will not find anything here that you could not find elsewhere online. But Jurify helps you zero in on the materials that are pertinent to your search — and, Lopez says, to the most credible and most authoritative of those materials.

A key way it does this is through the use of tags. Jurify has created 586 tags that are tied to specific issues in corporate law. The site’s default method of search is by these tags. They allow for three levels of nesting, with the idea that you can “find what you want in three clicks.” To start a search, you might pick the tag “Governance.” From there, you select a subcategory, such as “Audits & Accounting” or “Board Committees,” and then from there a further subcategory.
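A three-level tag hierarchy like this is essentially a nested tree. Here is a minimal sketch; the tags below “Governance” come from the article, but the third level and the resource IDs at the leaves are invented for illustration:

```python
# Hypothetical slice of the tag tree; leaves hold resource IDs.
TAGS = {
    "Governance": {
        "Audits & Accounting": {
            "Internal Controls": ["res-201"],
            "Auditor Independence": ["res-202"],
        },
        "Board Committees": {
            "Audit Committee": ["res-101", "res-102"],
            "Compensation Committee": ["res-103"],
        },
    },
}

def drill_down(tree, *clicks):
    """Follow up to three tag clicks down the tree and return
    whatever sits at the selected node (subtags or resources)."""
    node = tree
    for click in clicks:
        node = node[click]
    return node

print(drill_down(TAGS, "Governance", "Board Committees", "Audit Committee"))
# → ['res-101', 'res-102']
```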

Alternatively, you can start a search using the site’s glossary, which lets you browse all the available tags, or simply search by keyword. However you search, Lopez says, the site’s subject-specific tagging will deliver far more precise results than a generic search engine can, and even more precise results than other legal-research sites. If you are ever uncertain about the meaning of a tag, hover over it to see its definition.

Once you get your search results, Jurify provides a number of ways to narrow them. You can filter by keywords, tags, jurisdiction and other facets, as well as by content type.

Jurify draws its content from a range of web sources. In addition to cases, statutes and regulations, content includes blog posts, journal articles, law firm client alerts, news articles, forms and training materials.

Crowdsourcing Content

Jurify retains elements of its original plan to tap into crowdsourcing. Users are encouraged to add content of their own to the site – blog posts, articles, forms or whatever. The contributor of a resource will be identified on the resource page. Eventually, Jurify will reward contributors with greater prominence on the site.

All three of the company’s founders have worked as corporate lawyers, both in-house and at major law firms. Their belief is that corporate counsel will be early adopters of the site, and that their presence will serve as an incentive for private-firm lawyers to contribute content and raise their own profiles on the site.

Of course, you need to get the lawyers to come to your site in the first place, Lopez acknowledges. “You need to give users a reason to come to your site. That’s what we’ve tried to do, by giving them answers far faster than anyone else can.”

Free for Research

There will be no cost to register for Jurify or to conduct research. And Lopez says it will stay that way. “All the research is free and will always be free,” he says. The site will generate revenue by offering a premium membership.

Premium members will be allowed unlimited downloads of legal forms and precedent. At launch, the site will have 1,000 forms, all prepared and formatted by Jurify. Non-paying members will be limited to 25 downloads.

As of my conversation with Lopez, pricing for premium membership was not final but he expected to offer the first 1,000 premium registrants a price of around $350 a year.

“We launched a completely different website last year, with a smaller database and different technology,” Lopez says. “We learned a lot from that, including that it wasn’t good enough. We knew we needed a smarter keyword search and a larger database.”

From what I saw during Lopez’s demonstration, going back to the drawing board was the right move for Jurify.