The Failure of Crowdsourcing in Law (So Far, At Least)

Above are the slides from my July 20 presentation on crowdsourcing to the American Association of Law Libraries annual meeting. When I first suggested the title, I was sure the presentation would be a positive one, demonstrating the ways in which crowdsourcing and collaboration “are changing” legal research. I have long been a believer that crowdsourcing can help democratize legal research and enable free research sites to become more viable alternatives to paid sites.

But as I dug deeper into my research for the presentation, my long-held fears about crowdsourcing were increasingly confirmed. It has just never worked well within the legal profession. Over the years, site after site has attempted to make a go of crowdsourcing, and almost all have failed. Why is that?

I have a quote in one of my slides that may pretty well sum up the answer. It is from Apoorva Mehta, who is now a huge Silicon Valley success story as the founder of the grocery-delivery service Instacart but who, earlier in his career, attempted to start a legal networking and crowdsourcing site called Lawford (later renamed LegalReach). Asked later why Lawford failed, he said:

I didn’t know anything about lawyers when we started. Turns out, they don’t like technology, and they don’t like to share things.

Anyone who is considering starting a crowdsourced law site should take Mehta’s quote, frame it and hang it above their desks.

That said — and perhaps I am ever the optimist — I do believe there is hope. Three sites in particular stand out to me as potential success stories in the small world of crowdsourced legal research. I’ll get to those later in this post, but first let me recap some of the history as I presented it at AALL.

The Idea of Crowdsourcing

Thanks to Wikipedia, we all get the basic idea of crowdsourcing. A dictionary defines it this way:

The practice of obtaining needed services, ideas, or content by soliciting contributions from a large group of people and especially from the online community rather than from traditional employees or suppliers.

You could argue that a company such as West Publishing has been using crowdsourcing for decades. It relies on armies of lawyers to write the headnotes, assign the key numbers and create the editorial enhancements that add value to the raw cases it publishes.

Of course, West pays its editors, so that doesn’t count as crowdsourcing. But what if you could replicate those editorial enhancements through crowdsourcing? What if a free legal research site could post cases and other primary legal materials and ask its users to help enhance, explain and explicate those materials? That, my friends, is the aspirational ideal of crowdsourced legal research.

In the legal field, one of the earliest attempts at crowdsourcing was by Harvard (then Stanford) law professor Lawrence Lessig. His book Code and Other Laws of Cyberspace came out in 1999, and five years later it needed an update. So Lessig posted it to a wiki and invited the Internet community at large to edit it by adding their ideas, questions or new material. In January 2006, Lessig took the product of that wiki edit and added his own edits to produce Code Version 2.0. Revision by crowdsourcing seemed to be a success.

But where Lessig appeared to succeed, many others have failed. The legal Web is haunted by the spirits of the many crowdsourced sites that have come and gone. Among them:

  • Spindle Law. Launched in 2010 (see my write-up), it was to be “a new kind of legal research and writing system” that would make legal research “faster and smarter.” Part of the way it planned to do this was by using crowdsourcing to build on the knowledge of its users. All registered users could add or edit authorities, edit its tree-like hierarchy, comment on authorities, and vouch for or reject authorities.
  • Jurify. This site’s motto said it all: “No lawyer is smarter than all lawyers.” Launched in 2012 as the “first mass collaboration platform for lawyers and clients,” Jurify’s focus was on using crowdsourcing to enhance access to legal research. “Think of it as a Wikipedia for the law,” VentureBeat reported at the time. “By crowdsourcing the curation and information-gathering process, the startup plans to slash subscription fees for legal research.” Not long after it launched, it disappeared, only to re-emerge in January 2014 in a version that scaled way back from the original but stuck with the crowdsourcing idea. It did not survive the year.
  • This was founded in 2011 (see my write-up) by an attorney who was a VP and legal counsel at Brown Brothers Harriman. His idea was to use crowdsourcing to come up with better legal forms. Anyone could post a form and anyone else could revise or comment on it. His hope was to use crowdsourcing to achieve a consensus of what should and should not be in legal agreements. It never took off.
  • Lawford. This was the site launched by the aforementioned Mehta in 2011. On my blog at the time, I said this about it: “Lawford’s developers have the ambitious goal of building the largest legal networking platform in the world. In fact, they say that they hope someday to have every lawyer in the world become a contributing part of the site.” They planned to partly populate the site with court opinions and legal articles and then have its users build discussions around them.

Besides these research sites, there are the corpses of many legal networking sites that died primarily because of the lack of one essential ingredient: participation by lawyers.

Suspended Animation

Besides the sites that died, there are also a number of crowdsourced legal sites that remain online, but in what can only be described as states of suspended animation. There is, for example, the LawLibWik, a wiki for law librarians that was created in 2006 and last updated in 2007. Another is the Civil Law Dictionary, where the last update I could find was six years ago and where all the updates were made not by the crowd, but by a single person, the site’s founder.

A site I am particularly disappointed to see becoming dormant is Mootus. When it launched in 2013, it struck me as having the potential to gain some traction. I wrote about it both here and in the ABA Journal. It took a different approach to crowdsourcing, describing itself not as a legal research site, but as a platform for “open online legal argument” designed for both law students and practicing lawyers.

The idea was that a user would post a legal issue to be “argued.” Other users would respond by posting cases they believed were relevant to the issue, together with their arguments for why a case applied. Still other users could then comment on a posted case and vote on whether a case was “On Point” or “Off Base.”

But with activity on the site seeming to have dwindled, I asked its co-founder, Adam Ziegler, what had happened. Here was his response:

Our early experiments asked:

  1. Would lawyers post research questions to the crowd?
  2. Would the crowd post answers in the form of legal citations?
  3. Would other lawyers find the public Q&A thread useful/helpful?

We found yes to the first, no to the second, yes to the third.

On the second, we found “no” even for unemployed new lawyers sitting on the couch – pretty clear refutation of our hypothesis that we would find early contributors among the community of underemployed lawyers.

My takeaway from those results was that explicit, unpaid crowdsourcing, Q&A-style, isn’t a viable model right now. I’d love to see someone prove that wrong.

Note his conclusion: Not a viable model right now.

Another dormant site that seemed promising is Law Genius, whose motto is “Legal analysis for the crowd, by the crowd.” It is part of the larger Genius network of crowdsourced community sites, all of which grew out of the original site, Rap Genius, which was started in 2009 for the purpose of listing and annotating rap lyrics.

Rap Genius was so successful that soon, users started using the site to annotate all sorts of other stuff, from the collected works of Shakespeare to the roster of the 1986 New York Mets to the warnings on the back of a Tylenol bottle. A year ago, the site officially relaunched as Genius, becoming a hub for a range of communities devoted to topics such as rock, literature, history, sports, screen and tech. All are united by the site’s overarching goal, “to annotate the world.”

All of these Genius sites seemed to thrive. And then came Law Genius last November. It was an effort to crowdsource statutes, case law and other legal news. When it launched, I interviewed its editor, a Yale Law School graduate. She said:

There’s so much information lawyers have (particularly in our own little fields of expertise) and we have so much to say about what’s happening, though we usually keep those thoughts to ourselves, either writing emails to listservs or blogging in our small interconnected blogospheres.

I thought, wouldn’t it be great if those conversations happened publicly, around the text of actual opinions and statutes themselves? And before you know it, I came here to kickstart Law Genius.

Fast forward and there has been virtually no activity on Law Genius. The latest update shown on the home page as of today is the same update that was there when I wrote about the site in November. As best as I can tell, the editor I interviewed left not long after I heard from her.

Is There Hope?

So is there hope for crowdsourcing in legal research?

There are a handful of legal sites that achieved some success in using crowdsourcing to build content. Starting in 2005, the Electronic Frontier Foundation used crowdsourcing to create its Internet Law Treatise. But it has not been updated by anyone for more than two years and not substantially updated for more than five years. JurisPedia has used crowdsourcing to create an encyclopedia of world law with more than 1,600 articles. But it has not been updated by anyone since 2012. I don’t know about you, but I like my legal research materials to be a little more timely than that.

One wiki-like site that always impressed me for being fairly thorough and up-to-date was Judgepedia, an encyclopedia of courts and judges. It no longer exists as a freestanding site, having been absorbed earlier this year into Ballotpedia, an encyclopedia about American politics and elections. But its content is still there and still timely.

Curious about its secret for crowdsourcing success, I contacted its editor-in-chief, Geoff Pallay, and asked him about it:

How do we keep it up-to-date? That is the billion-dollar question, isn’t it? Nearly all of our contributions are from staff writers — close to 99%.

So Ballotpedia’s “secret” is that it is not really crowdsourced. Like West, which I mentioned earlier in this post, it pays its writers. And that’s not crowdsourcing.

Three That Could Be Contenders

Abandon all hope ye who read this post? Not so fast. There are three sites that appear to be achieving some success using crowdsourcing and that could provide models for others.

The first is Casetext, a site that I have written about frequently in recent years. It launched in 2013 as a free legal research site that would use crowdsourcing to annotate cases. Users would be able to add tags to cases to help organize them, add links to secondary sources that discuss the case, add annotations to the case, and reply to and comment on other users’ annotations.

That original concept was slow to get going, so Casetext looked for ways to get legal professionals more engaged in the site and in discussions around cases. Its first big step in this direction came last October, when it introduced new community pages. These are pages organized around practice areas and interests where lawyers can contribute analysis, meet others in their fields, and engage in discussions about current legal developments. These have become fairly popular, with some of its communities now having over 10,000 followers.

Then, in June, Casetext launched Legalpad, a publishing platform custom designed for people who write about the law. Users can draft articles with it, and those articles get published to Casetext’s communities and also become part of the Casetext database of legal commentary. If an article discusses a case, the article links directly to the case and the case links back to the article.

It’s all about incentives, Casetext founder Jake Heller told me:

When we started out, we focused on annotations. The truth is, attorneys don’t write that way. The right people weren’t incentivized to write.

We created communities to give people real incentives to write on Casetext.

Now what we’re trying to do is make it easier. You don’t need a blog or WordPress. You don’t need to worry about SEO – we have 350,000 users every month. From day one, you can speak to a built in audience.

The big-picture goal is to match the incentives and interests of people who are excited to write about the law … with what we think will be a really powerful research experience. … We want to make these discussions into data, to overlay the social layer with the primary source documents.

And then there is the oldie-but-goodie Wex, the crowdsourced legal dictionary started in 2005 by the Legal Information Institute. Although it has used something of a crowdsourced model since its start, earlier this year it began a more ambitious effort “to change fundamentally the nature of Wex into something that was more crowdsourced,” LII Associate Director Craig Newton told me.

As part of that effort, the LII — in a move similar to Casetext’s — built its own authorship platform within its content management system. The idea was to make it easy for its contributors to contribute and for its editors to edit. The LII recruited around 100 contributors to kick the tires on the new system and also developed a “how-to” guide.

“While the goal is ultimately true Wikipedia-style crowdsourcing,” Newton said, “we’re several steps away from that.”

Also similar to Casetext, the LII is focusing on how to incentivize contributors. One way they plan to do that is by highlighting contributors through author profiles. Another is to enable contributors to earn “karma points” and badges.

It remains my belief (founded more on instinct than data) that author attribution is a key piece. Not only is attribution a big incentive for folks to write for Wex, it is a good tool for the reader in order to evaluate the “trustworthiness” of what’s in the article.

Last but not least of my crowdsourcing success stories is CanLII Connects. A project of the Canadian Legal Information Institute, it was launched in April 2014 as a way to marry the case law CanLII houses with commentary from the legal community. To do this, it encourages lawyers, scholars and others who are competent in legal analysis to contribute commentary on cases or to post summaries of cases. (See my 2014 write-up.)

Only registered members are allowed to post, and only after they have been approved by CanLII Connects staff. Once membership is granted, any member has the ability to add content, comment on the content of other members, and up-vote content. The site also allows entities to register as a “publisher” and post content. A publisher can be any law firm, organization, group, business or school that is also a member of the legal community. Publishers can post content to CanLII Connects directly and can also authorize affiliated individuals (such as members of a firm) to post under the publisher’s name.

The site also draws content from blogs and other publications. It does not scrape content directly from other sites, but it encourages authors to republish their content on CanLII Connects. In this way, the author can link his or her content directly to the ruling it discusses and make it discoverable by someone who is researching that ruling.

Take one look at the site and you can see for yourself that the formula is working. Not only has it developed a good flow of content, but the content is thoughtful and timely. Colin Lachance, who in April left his position as CEO of CanLII, told me earlier this year:

We are at the beginning of a virtuous circle of growth: increased integration with our primary law site means greater awareness of the commentary site, including greater awareness of who is contributing; when lawyers and law profs see work from their peers on the platform, they are motivated to join and contribute their own work; expanding content from an expanding roster of respected professionals drives greater usage which make the platform more complete and more attractive to existing and future contributors to keep the flow of content going; continual growth will prompt us to pursue deeper integration of commentary and primary law which, hopefully, keeps the circle moving.

In other words, momentum is needed. As a site begins to build involvement, that involvement yields further involvement. There needs to be a critical mass of participation before any crowdsourced site can take off.

What Is The Secret Sauce?

So is there a secret sauce for crowdsourcing success in the legal field? Judging by these last three sites, I would sum it up as follows:

  1. Make it easy to contribute.
  2. Make it rewarding to contribute.
  3. Make the content useful to others.
  4. Success will breed success.

I’d love to hear your thoughts on crowdsourcing. I continue to believe that it could someday change the face of legal research. The big legal publishers do not have a monopoly on smart lawyers with good ideas and insights. If we could tap into the knowledge of the legal community at large, we could build a library of free legal research materials that could rival any paid ones.

Comments
  • Crowdsourcing legal research isn’t working for one reason: lawyers have lives. We’re busy, either with work or with our personal lives.

    When we’re working, our focus is working on matters for our clients. Writing as a marketing tool is great – I do plenty of it myself. But why write on a third-party site when you can get more bang for your buck by posting relevant (and therefore keyword-rich) content on your own website/blog? And, if you’re writing for a third party, it’s a safer bet to write for an established publication (or website) than on most of the crowdsourcing platforms mentioned in your article (Casetext may be the exception, since it’s been able to reach critical mass and therefore deliver eyeballs). Even better, write for a variety of established publications (or websites), which will increase your reach and will also garner you valuable backlinks from a variety of sites with Google juice (is that still a thing?) to your website.

    When we’re busy with our personal lives, we’re not thinking about law: we’re pursuing outside interests. Look at the successful Genius topics you mentioned: rock, literature, history, sports, screen and tech. All but literature and history surely draw from legions of passionate fans/hobbyists.

    • Lisa – The goal of some crowdsourced sites has been to help lawyers save time, not spend more of it, by helping them more quickly find relevant and authoritative research.

  • Bob,

    Your blog is on Justia, but you don’t mention that Justia has long allowed attorney annotations to case law, a form of crowdsourcing. Or do I not understand something? Fine article.

    • I was not aware that Justia had started allowing annotations. You’re right — in concept, it is a form of crowdsourcing. However, as I look through the site, it appears that very few cases have annotations of any kind and, of those that do (mostly SCOTUS cases), the annotations are written by Justia staff writers.

  • Sure, but to find relevant and authoritative research, some author or authority (lawyers) has to spend time researching and writing it.

  • Fair enough, but the content has to be created before it can be found. Moreover, how many lawyers are actually using crowdsourced platforms (let’s focus on Casetext for the US and CanLII for Canada) in their work for clients? In other words, how many of Casetext’s 350,000 monthly users are using Casetext in a manner similar to how they would use a traditional legal research platform (whether that platform is free, low-cost or premium)?

    If I’m looking for reliable, human-written annotations, I’ll use WestlawNext. If I’m looking for longer exegesis on an issue, I’ll find a published treatise (as part of a premium legal research subscription, or even free online – such as the treatises published by the Federal Judicial Center) or article (as part of a premium legal research subscription; free online via SSRN, etc.; or via paid access, but located through Google Scholar) written by an expert.

  • Ricardo Barrera

    Yes, very few cases have annotations, so maybe another “failed” crowd-sourced law site? Here’s the url to the image inviting contributions:

  • A variation on a theme:

    “CiteGuru is an innovative research tool for attorneys and researchers in need of legal research. Through the use of crowdsourcing and an open marketplace, users can complete research assignments and be rewarded for their efforts. CiteGuru offers three types of research projects: (1) case citations; (2) case briefs; and (3) research memorandums.”

  • Good questions, Lisa. We are actually now closer to 450,000 monthly users, and based on our usage statistics, I would definitely say that they are using us for traditional research uses. A lot of it comes down to the fact that the site is fast and easy to use, with accurate search; we have all federal cases and all state appellate cases (as well as many state lower-court opinions), and a growing database of state statutes (California, New York, Delaware, New Jersey, and Florida for now); and have a nice-looking interface. And we’re free 🙂

    But we’re seeing a lot of people using us for the annotations as well, especially as a tool to supplement legal research done elsewhere. It really comes down to which information you’d like to see while you’re doing legal research. I generally feel it’s much better to have access to the viewpoints of practitioners in my field, subject matter experts, and professors who have intensely studied the issues rather than annotations from anonymous editors at big publishers. You’ll often get a much more enriching experience drawing on their viewpoints.

    So, for example, at the end of the Supreme Court term, we had over 50 people contribute viewpoints on the King (Obamacare) and Obergefell (same-sex marriage) cases. This included people like Erwin Chemerinsky and Laurence Tribe, but also the lawyers who argued the case before the Court or the lower courts, amici in the cases, and state solicitors general. That’s the kind of resource

    Check them out — lots of great reading:

    Do we have perfect annotations on every case? No. Right now great annotations tilt more heavily towards “famous” cases. But we are working on ways to bring in more diverse opinions to a broader set of cases. Our site gets better every day, both from a technological standpoint and from the content added, and I encourage you to check in periodically!

    We also don’t think that, in the short term, we’ll be the only place you check.

  • CiteGuru has nothing to do with collaboration. Look carefully:

    “Through the use of crowdsourcing and an open marketplace…”
    “Our researchers will find you the most relevant results.”
    “CiteGuru offers three types of research projects: (1) case citations; (2) case briefs; and (3) research memorandums.”

    It’s nothing more than an outsourced legal research service using collaboration as a buzzword, and as a way to rope in potential clients. The CiteGuru website contains no information on the credentials of the people performing the research. Additionally, the list of target customers on Betalist indicates that the company will provide its services to students (presumably including law students), which contributes to academic dishonesty. CiteGuru’s founder is identified as Bryan Fox of Pittsburgh Startup Law. According to the firm’s website:

    “Pittsburgh Startup Law serves the city of Pittsburgh and the surrounding areas’ growing community of startups and entrepreneurs. Pittsburgh Startup Law is not a traditional law firm, not a one stop shop for copy-pasted documents, and not an impersonal commercial enterprise. By focusing on transparency, easy access, and tailored, cost-effective legal services, Pittsburgh Startup Law is helping local businesses grow and succeed.”

    Transparency, huh?

    • bryanfox

      Hi Lisa Soloman, I’m Bryan Fox, the creator and founder of Citeguru, Pittsburgh Startup Law, etc., and I occasionally, from time to time, like to google projects that I work on. I came across your post here and felt obliged to comment (even if it’s a year later) since many of the things you write aren’t fair representations of me or what I’m trying to accomplish. And really, some of the things you wrote are just wrong.

      Firstly, Lisa, since you seem curious about these kinds of things: I keep my law practice and projects like Citeguru separate – legally, financially, and otherwise. Doing so isn’t cryptic or misleading. It’s to insulate and keep each business protected from potential liability and other associated problems that crop up over time in the life of a business.

      I appreciate that you took time to research my background on, where I personally published those pages about me and the projects I’m associated with. I didn’t bother looking you up, but feel free to shoot me an email or get in touch some other way if you’d like to chat sometime about backgrounds.

      I’m not trying to “rope in potential clients” as you put it. Actually, for Citeguru, I’m trying to assist attorneys and other legal professionals in better serving *their* clients or growing *their* business. That’s what Citeguru is for. Not an advertising gimmick for my law firm…

      Also, what’s with the academic honesty claim? Are you serious? Do you really think that and similar accusations are appropriate to make with zero knowledge of a beta platform? Yikes.

      I’m willing to engage you in a discussion on the merits of how “crowdsourcy” Citeguru is or isn’t. But not the other stuff you made up.

      Anyways, just wanted to chime in for a few paragraphs. If you or anyone are interested in learning more about what Citeguru *actually* is instead of reading semi-libel from Lisa Soloman, please feel free to reach me at or visit


      Bryan A. Fox

  • Fantastic post, Bob. I think this is the most accurate and complete summation of crowdsourcing in law and legal research that’s been written so far, and I think you hit the “keys to success” in your conclusion on the nose.

    The fact that so many sites have tried but haven’t yet succeeded is, to me, a sign that many people acknowledge both that there’s value here and that it’s really hard to get right. I’ll keep you posted on developments as we try to make contributing easier, more rewarding, and useful to our researchers.

  • Lisa raises a really good question — are lawyers not contributing because they are busy professionals, and so all crowdsourcing in law is doomed to fail?

    I think no, for a few reasons.

    First, thousands of lawyers already contribute their wisdom for free — in the form of blogging. The ABA did a study that found around 24% of law firms maintain blogs, while 7-10% of individual lawyers maintain their own blogs for professional purposes. And I can tell you from when I practiced that these blogs are REALLY good — some of the best information accessible anywhere, paid or not. The reason? Many are doing it to market their firms or themselves, and by sharing knowledge and insights for free, they’re finding that they get far more interest — and clients — than by keeping quiet. We hope to tap into that energy and interest and provide people with another great outlet to reach a very wide user base (now around 450,000 people per month).

    Second, some people are really passionate, and despite being really busy, they want to make a difference. Josh Lee, a federal defender in Arkansas who is an anti-death penalty crusader, is one great example: His analysis on death penalty issues is fantastic, and he’s also shared a number of really great briefs in our briefs database.

    Third, we’ve seen it work in other professions. My favorite example is Stack Overflow, a site for engineers to ask and answer difficult programming questions. I employ engineers, and I can verify that they’re as busy as lawyers, and answering questions is really tough work. But the site taps into a mix of incentives (including the potential to land a great job due to one’s involvement and prowess in answering questions) and a sense of community that makes it one of the best resources for programmers.

    Finally, the proof is in the pudding. We’ve seen an amazing growth of contributions to Casetext in the recent past, especially in certain subjects. Check out, for example, the posts being added in the employment law, criminal law, and Torts & Products Liability communities. Could there and will there be more? Yes. But we’re encouraged by what’s been happening so far.

  • [I wish the “reply” feature was working to chain this discussion]

    Casetext is doing a fantastic job, no doubt about it. Still, I doubt that many cases outside the newsworthy ones and perhaps cases in areas that a few people are particularly passionate about (such as some areas of criminal defense) will end up with annotations.

    • Lisa — I hadn’t realized that the nested-comments feature wasn’t working. It’s supposed to be. I’ll try to get it going.

  • I hope we pleasantly surprise you, Lisa 🙂

  • Bob, great article. When we were launching Ravel in 2012 and following a design process of intensively engaging with users (lawyers and students), we heard two things again and again:

    1. Lawyers were not comfortable sharing with the outside world (and sometimes not even within their firm)
    2. They were already drowning in information and looking for ways to cut through the clutter rather than have even more to sift through

    To riff on the secret sauce you identified, two specific challenges are that the reward scheme must overcome #1 and the “useful content” must address #2.

  • I have long wondered why nobody seemed to want to participate in crowdsourced legal platforms. And after working for one of the bigger legal tech companies, and having them pass over my idea for adding annotation to caselaw and statutes (good thing they did, in retrospect), I started following the other companies that do crowdsourced legal annotation.

    Today, I’m a solo attorney. And I haven’t once used one of those sites. Why? They make no effort to appeal to me.

    The folks who will build out these databases with crowd-sourced annotations and commentary are SmallLaw. BigLaw folks can afford the larger databases with professionally curated citations and treatises. SmallLaw is always looking for a way to cut costs. And if they use your site for their own research, they’re probably more likely to add an annotation or two along the way. SmallLaw operates on a hyperlocal level — what is the state law and how do my local courts interpret it?

    Casetext is the closest to what I would look for: their communities at least narrow by practice area. But I wouldn’t spend time on a general family law discussion forum or online community — I’d look for a state-specific community, as California Family Law (a community property state) is drastically different than, say, North Dakota Family Law. I have yet to find anything local, other than email listservs.

    A potential roadmap: hook up with local bar associations and start communities with them. Build from the local level on up.

  • Hi,

    Have you taken a look at This platform enables participation in the drafting of laws. Examples are: Chile: and Iraq:

  • The Legal Productivity blog has information about a new app called Aptorney that purports to “encourage” users to annotate primary sources, with “practicing attorneys review[ing] annotations several times per week.” There are so many issues with this app (such as no indication of the dates of case law coverage, no active website from the creator, and no information about who’s behind the app) that the certain failure of the crowdsourced annotations “feature” is the least of its problems.