Ruhl and Katz: How Complex is the Law, and How Complex Should It Be?

Legal Informatics Blog - Sat, 04/12/2014 - 04:48

Professor Dr. J. B. Ruhl of Vanderbilt University and Professor Dr. Daniel Martin Katz of Michigan State University and the ReInvent Law Laboratory presented a paper entitled How Complex is the Law, and How Complex Should It Be?, at SEAL XV: Conference of the Society for Evolutionary Analysis in Law, 4-5 April 2014, at the University of Illinois College of Law, in Champaign, Illinois, USA.

Here is the abstract, from the conference program:

Can we measure the complexity of law (and if so how and what would we do with that knowledge)? Legal scholars have begun to employ the science of complex adaptive systems, also known as complexity science, to probe these kinds of descriptive and normative questions about the legal system. While this work is illuminating, for the most part, legal scholars have skipped the hard part — developing quantitative metrics and methods for measuring and assessing law’s complexity. This paper explores the empirical and normative dimensions of legal complexity at a depth not previously undertaken in legal scholarship — including identifying useful metrics and methods for studying legal complexity.


Filed under: Abstracts, Applications, Articles and papers, Conference papers, Methodology, Research findings Tagged: Complexity of law, Daniel Martin Katz, J. B. Ruhl, Legal complexity, Measuring legal complexity, Measuring the complexity of law, SEAL, SEAL 2014, SEAL XV, Society for Evolutionary Analysis in Law, Statistical methods in legal informatics
Categories: Teknoids Blogs

My Twitter Digest for 04/10/2014

<CONTENT /> v.5 - Fri, 04/11/2014 - 14:30
Categories: Teknoids Blogs

Heartbleed: What Lawyers and Law Firms Need to Know

The Lawyerist - Fri, 04/11/2014 - 12:26

Yesterday, partially in response to news about the “Heartbleed” computer exploit, Sam wrote a post about the importance of lawyers understanding how the internet works. Given all the media buzz about Heartbleed, I thought it might be useful for lawyers and law firms to understand what it really means for them, without either too much techno-jargon or over-use of dumbed-down metaphor.

So What is Heartbleed?

Leaky website encryption.

Lots of websites that require password log-in use an encrypted connection to your browser, called SSL. You can see this when you go to sites that have an “https” website prefix, as opposed to the normal “http” prefix—the “s” means they’re using encryption to protect the data sent between you and that website.
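
To make that "s" concrete, here is a minimal sketch (my own illustration, not from the original post) that uses Python's standard library to open an encrypted connection to a site and print the certificate it presents; the hostname is just an example placeholder.

```python
# A minimal sketch, using only the Python standard library, of opening an
# encrypted (SSL/TLS) connection and reading the certificate the server
# presents -- this is what the "s" in "https" buys you. The hostname is an
# example placeholder.
import socket
import ssl

hostname = "example.com"
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        print("Protocol:", tls.version())
        print("Issued to:", dict(item[0] for item in cert["subject"]))
        print("Issued by:", dict(item[0] for item in cert["issuer"]))
        print("Expires:", cert["notAfter"])
```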

One widely used implementation of SSL is an open-source software library called “OpenSSL”. For the past two years, the OpenSSL software has had an undetected bug in its code that could have allowed people to see what was supposed to be encrypted data passing between you and the websites using OpenSSL.

“Heartbleed” is just the creative name—given by internet security researchers—to identify the software bug in OpenSSL that allowed for this potential encryption leak.
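
For the technically curious, here is a loose illustration of the kind of mistake involved, written in Python for readability (the actual bug was in OpenSSL's C code and the details differ); the "server memory" and its "secret" are invented for the example.

```python
# A loose illustration (in Python, not OpenSSL's C) of the class of bug behind
# Heartbleed: the heartbeat reply trusts the length the client *claims* to have
# sent instead of the length it actually sent, so nearby memory leaks out.
# The "memory" contents here are invented for the example.
SERVER_MEMORY = bytearray(b"hello" + b" ... secret: hunter2 ... " + b"\x00" * 10)

def heartbeat_buggy(payload: bytes, claimed_length: int) -> bytes:
    SERVER_MEMORY[: len(payload)] = payload
    # BUG: no check that claimed_length <= len(payload)
    return bytes(SERVER_MEMORY[:claimed_length])

def heartbeat_fixed(payload: bytes, claimed_length: int) -> bytes:
    if claimed_length > len(payload):      # the missing bounds check
        raise ValueError("claimed length exceeds actual payload")
    return payload[:claimed_length]

# The attacker sends 5 bytes but claims to have sent 40:
print(heartbeat_buggy(b"hello", 40))   # leaks "secret: hunter2" from memory
print(heartbeat_fixed(b"hello", 5))    # returns only b"hello"
```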

How Did Heartbleed Happen?

By accident.

Because OpenSSL is an open-source software project, volunteer software developers around the world are able to submit suggested code edits and fixes, which can later be incorporated into the core software. Two years ago, a German software developer submitted some code fixes—intending to clean up some small software bugs in OpenSSL—and accidentally created a new, unnoticed bug—now called “Heartbleed”.

What Sites Are Impacted by Heartbleed?

Most of the big ones.

There are two ways to think about the potential impact of Heartbleed: direct impact and indirect impact.

The direct effects of Heartbleed involve theoretical access to your private data on sites that use the OpenSSL encryption code. These are usually “medium security” sites that require a password log-in and/or process payments.

  • Low security sites: Websites that don’t require log-in and don’t process payments rarely use SSL encryption and thus would not be directly impacted by Heartbleed.
  • Medium security sites: Non-financial-services websites that use log-ins and/or process payments AND use the OpenSSL software are the sites that were impacted. This includes Facebook, Google, Twitter, Yahoo! and more. You can find a list of major sites impacted by Heartbleed here.
  • High security sites: Most financial services websites (banks and credit card companies) have stronger encryption standards than OpenSSL and thus also aren’t directly impacted by this.

The broader indirect effects of Heartbleed involve the fact that many people use only a small number of (bad) passwords across the internet, which means that access to one of these passwords through the Heartbleed exploit could give someone access to additional sites using the same password.

Did Hackers Steal My Passwords or Client Files or Other Important Data?

Hopefully not.

Unlike the Target data breach last fall, Heartbleed was identified and announced before any known attacks occurred. Computer security researchers discovered the code problem last week and announced it immediately. Developers immediately started building software patches to fix the problem. Most affected sites have already implemented these fixes or will in the next couple of days.

It is certainly possible (maybe even probable) that in the past two years—since the creation of the “Heartbleed” code—a malicious hacker or espionage organization has been collecting and exploiting the vulnerability, but there isn’t currently any evidence that this happened to anyone.

UPDATE: It now appears—surprise to anyone?—that the NSA has known about Heartbleed for two years and didn’t tell anyone, because it’s been giving them easy access to otherwise-encrypted data.

What Should I Do About It?

Related: “Encryption: Enabling Basic Client File Security”

Use better passwords.

  • Minimum: Change your passwords today, as soon as the sites you use are patched. If you do nothing else, use the list of vulnerable sites above and change all of your passwords on those sites. You really, really need to do this today. That is the absolute bare minimum, though, and probably not enough to satisfy your ethical duties as an attorney.
  • Best practice: Use a password manager, and encrypt and back up your hard drive. There are four fairly simple steps lawyers (and everyone else) can take to dramatically increase their data security.
    1. First, encrypt your hard drive. This takes just a few minutes and is usually free. By encrypting your hard drive, you secure your physical computer from snooping.
    2. Second, back up your hard drive. You’ll have to decide whether to use a file syncing tool like Dropbox or a pure back-up service like CrashPlan, or both, but your data should be backed up to a computer or server that is not in your office.
    3. Third, use a password manager. Password managers like LastPass, 1Password, PasswordBox, and KeePass allow you to create and manage unique, strong passwords for each of your website log-ins. Rather than having to memorize lots of different passwords for all of your sites (or worse, but more common, using the same password for all of your sites), password manager software generates super-strong passwords for each of your sites, then stores them in an encrypted file that you access with one master password (a rough sketch of that generation step appears just after this list).
    4. Fourth, turn on two-factor authentication. Many web services (Google, Dropbox, etc) allow users to add “two-factor authentication” to their log-ins. This means that when you sign in, in addition to your username and password, you also need to input an additional piece of information—usually a code the site texts to you as you log in. This way, if anyone ever did obtain your password, not only would they not be able to log in, your phone would alert you that they were attempting to get in.
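
As a rough illustration of step 3, here is a minimal sketch of the generation side of a password manager, using only Python's standard library; the length and character set are arbitrary choices for the example, not a recommendation.

```python
# A minimal sketch of the generation step a password manager performs: draw
# characters from a large alphabet using a cryptographically secure random
# source. Standard library only; length and alphabet are example choices.
import secrets
import string

def generate_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())   # a different strong password on every call
```
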
After This and the Target Data Breach, Should I Fear the Cloud?

Related: “It’s Time for Lawyers to Re-Think the Cloud”

No, but maybe.

Fear of things you don’t understand isn’t a particularly useful thing. The “cloud” (software and data that is stored on servers outside of your location that you access through the internet) is a complex and changing thing. This complexity allows for some truly amazing innovations in technology, but also comes with potential risks.

Lawyers have a particularly-strong duty to understand what is happening with their confidential client data.

A good understanding of how the cloud—and a law firm’s particular web applications—works should also include a good understanding of the variety of ways that lawyers and law firms can protect themselves from risk.

Proper, rational risk analysis requires learning about the likelihood and magnitude of potential harm, as well as the cost and burden of both the possible security measures and the alternative options. For instance, if your “fear” of the cloud leads you to keep everything in paper form, you are almost certainly leaving your important client data at greater risk of theft, fire, flood, or snooping than if you use best practices in the cloud.

That said, this analysis is very dependent on your particular circumstances.

What’s next?

Nobody knows.

Here’s the reality: stuff like this (and probably worse) is inevitable. As the sophistication of web and mobile applications grows, so do the methods of hackers and espionage operations. Similarly, increasing reliance upon and interactivity between these apps makes your data more vulnerable to hacks and bugs.

Who knows whether the next big internet security news will be a big data breach, a code exploit, a hack into one of your favorite websites, or something totally unforeseen. The question isn’t whether there will be security problems on the internet, but whether you are being smart about how you use technology to keep yourself as secure as possible.

It is legitimate to question whether these tradeoffs are worth it for your particular situation, but that requires an understanding of what’s really going on, and a rational analysis of the costs and benefits of technology use and data security protocols, not just a resort to fear and doubt.

Heartbleed is a big deal in internet security, but hopefully its biggest effect will be in getting you to use more care in how you protect yourself online.

Categories: Teknoids Blogs

The Friday Fillip: Letting Go . . . Hopefully

slaw - Fri, 04/11/2014 - 06:00

Context is everything. Which is simply to say res non ipsa loquitur. Things need more things near them to achieve meaning, significance, import — perhaps even for us to see them. That lump of metal there . . . near the edge of a smoking crater? or on a pedestal in a room hung about with paintings? Big diff.

Now, lumps of metal don’t have as their primary function the carrying of meaning. Words, though, do. Which is to say context is routinely consulted when we utter. Goes without saying — without awareness, most of the time. Some folks, however, have to be a little more conscious than others when it comes to words and their meanings. Lawyers, for example, aren’t allowed to hit each other or judges, and so must fall back on language, particularly written language, in order to get their jobs of persuasion and agreement done.

For common lawyers particularly the tablet on which they write is almost never rasa. It sits atop a great pile of verbiage stretching back in time (and, indeed, far and wide at the same time), and this “long tail” of usage impinges on the meaning of what they write or say today. This is true of all of us all the time, of course, but lawyers — and writers — know this more acutely.

All of which is a long-winded way of saying that when the past — the deep past, not just yesterday — is an important context, there’s a problem with change. “Past knowers,” historians, if you like, can be slower to come into line with the rest of us as we go on our merry way inventing new meanings for old letter clumps or even whole new clumpages (often where none was needed). I’m a resister, though I try to give in gracefully when I must. But I still say “normality” rather than “normalcy,” even though the latter was a word U.S. president Warren G. Harding popularized way back in 1920. And I can’t yet let “unique” mean just “pretty special.”

That last example illustrates one kind of change that seems to happen as words become untethered from their pasts and float into new and wider contexts: sharper meanings become more diffuse. Makes sense, of course. People see a nifty or shiny term in some technical context, and either because it’s pleasing or, which is more likely, because technical means important and who doesn’t want a piece of that, the term gets lifted and blanded.

  • “Gambit” comes from chess, where it means the sacrifice of a piece early in the game in order to gain later advantage; now it just means any tricky or other attempt to gain advantage.
  • “Dilemma” comes from rhetoric — argumentation — where it means an attempt to compel your opponent to choose between (only) two equally unpalatable alternatives (the “horns” of the dilemma); now it means any difficult problem.
  • “Decimate” comes from the ugly Roman practice of killing off a tenth of a mutinous cohort (pour encourager les autres); now the sense of proportion is gone and it just means a large amount of removal or destruction. (“Cohort,” incidentally, has gone two ways from its original meaning of a group of 480 Roman soldiers: to any group united by a feature or purpose; and down to a single “friend” or companion.)
  • “Disinterested” in law still means, or should mean, without a pecuniary or other tangible interest in the outcome of a matter, something our judges should be; but everyone else has grabbed it and uses it to mean “not caring,” leaving “uninterested” high and dry.

This movement from the particular to the general, again something that common lawyers are perfectly familiar with, happens to sayings too. So, for example, “the lion’s share” used to be a mild witticism and meant everything; now it means only “most” or “the largest portion.” To use the “carrot and stick” tactic was once much more sophisticated and meant to dangle a carrot off the end of a stick just out of the poor donkey’s reach, so that the beast would strive forever forward; now it just means the crude pairing of reward and punishment. 

Some stuff from technical sources is a bit hard to understand, so that it’s not surprising that magpies use the shiny bits in all sorts of less precise ways. Again from rhetoric, “begging the question” is a little tricky, meaning to assume the answer within the question, much like a leading question in advocacy. But in general usage it’s taken to mean simply requiring or inviting a question. The phrase “steep learning curve” describes a particular plot on a graph where the x axis measures time and the y axis measures the degree of learning acquired; in such a setup, a steep curve means that a lot of learning takes place rapidly, exactly opposite to the meaning that most people give it, they presumably seeing it in the context of climbing, where steepness is difficult and is conquered only slowly. And lately I’ve seen a lot of people admired as “over-achievers,” a label I’m not sure I’d want, coming as it does from psychology, where it describes someone who has accomplished more than would have been expected of them, given their (limited) intelligence, the opposite of an “under-achiever.”

At some point, of course, the law is seen to have in fact changed and to have let go of old contexts. I will, at some point, let go of the oldest texts and their use as contexts, as I’ve already done in myriad ways I’ve already forgotten, I’m sure. It’s just that like the law, I let go slowly and a little reluctantly. But I’ll be along soon. Hopefully.

Categories: Teknoids Blogs

Dealing With Link Rot – Are DOIs the Cure?

slaw - Fri, 04/11/2014 - 06:00

Over the past decades the publishing industry has developed standards to provide unique identifiers to text products. The most well known is the ISBN, the International Standard Book Number, which now comprises 13 digits and ensures that the same titles published in different parts of the world can be identified separately. The version used for periodicals is the 8-digit ISSN – International Standard Serial Number. Then there is the International Standard Text Code (ISTC), a numbering system for the unique identification of text-based works, which links different text works within books, audio books, etc. All of these standards ensure that when an author or purchaser cites a work, the original work is correctly identified, and confusion between similar titles, for example, is avoided.

These standards are not applicable to citations using links from the internet. The issue of link rot, where a link used in an article written some years ago may no longer point to the original piece, is becoming more prevalent as time passes. The problem has been addressed by others, including blog posts written on SLAW, and there are studies and reports outlining the problem. If you write an article and provide URLs in your footnotes, chances are that these URLs may change over time. Servers move, websites upgrade, websites are redesigned and take on new URL identities. At the time of writing your piece, you are providing the reader with a great service, the chance to link directly to a source you have quoted. But with the passage of time, if the URL changes, short of republishing your article with new links, there is little that you, as the author, can do.
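
As an aside, here is a small sketch (my illustration, not part of the original piece) of how an author or library might check a list of footnote URLs for rot, using only Python's standard library; the URLs are placeholders.

```python
# A small sketch of checking footnote URLs for link rot: request each one and
# flag anything that fails or has been redirected elsewhere. Standard library
# only; the URLs below are placeholders.
import urllib.request

footnote_urls = [
    "https://example.com/old-article",
    "https://example.org/",
]

for url in footnote_urls:
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            final_url = response.geturl()
        if final_url != url:
            print(f"MOVED   {url} -> {final_url}")
        else:
            print(f"OK      {url}")
    except OSError as exc:  # URLError, HTTPError and timeouts all derive from OSError
        print(f"BROKEN  {url} ({exc})")
```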

There are several solutions offered to counter the problem, using the concepts of international standards, similar to the ISBN. These involve allocating a permanent link to an article. One service (with which we at the law library have affiliated) is called Perma, and it was discussed by Simon Fodden on SLAW last year. It is aimed at authors, allowing them to create a permanent archival link to their articles and to have these links included in law reviews, and it has buy-in from many prominent law faculties and institutions.

A project like Perma is well intentioned, seeking to achieve a consistency for law related articles, and giving the author a role. This may be most useful where the journal itself does not automatically create permanent links using an accepted standard already.

However, a number of registration agencies exist which use the DOI standard of a structured character string to provide unique digital object identifiers to all manner of digital objects; mainly journal articles, but also book chapters, video, and theses, etc. The DOI becomes the persistent citation to an article. Many publishers have adopted the DOI standard and this citation can be found alongside the titles in the journal.

There is one registration agency I thought it might be useful to mention: CrossRef, formed as an association of scholarly publishers which now includes libraries, standards organisations, NGOs, IGOs, etc. They are catholic in their definition of “publisher”. Established in 1999, its aim, as a not-for-profit organisation, is to facilitate scholarly communications. They offer several services to publishers, such as CrossCheck, software to prevent scholarly as well as professional plagiarism. They have been issuing DOI standard permanent links for journal articles for some time.

From a presentation on their website, they state:
“A DOI persists throughout changes in copyright ownership or location because it’s just a name used to look up an address in an easily updateable directory”

With over 65 million items indexed with a DOI and a free look up section, CrossRef has proven to be an invaluable tool for us recently. We needed to urgently obtain persistent links to a couple of hundred journal articles written by our academics, which were submitted as part of the Research Excellence Framework – REF – exercise universities undergo every few years in the UK. We had titles and authors, and sometimes the ISSN of a journal, but not all the hard copies of the journals, nor even subscriptions to all the titles.

On the look-up section we were able to locate the DOI – which looks like this example http://dx.doi.org/10.1017/S1472669612000643 – using titles or authors.

It is also possible to look up the actual DOI on the CrossRef inquiry page, using only the digits: 10.1017/S1472669612000643. CrossRef now provides a lookup for a batch of multiple references, which is a useful addition.
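
For readers who want to script this, here is a rough sketch of a programmatic lookup against the public CrossRef REST API, using the DOI quoted above; the particular response fields printed are my assumption of typical CrossRef work metadata, not something described in this post.

```python
# A rough sketch of a programmatic DOI lookup against the public CrossRef
# REST API. The fields printed are typical of CrossRef "work" records but are
# an assumption on my part; a given record may not include all of them.
import json
import urllib.request

doi = "10.1017/S1472669612000643"   # the example DOI from the post
url = f"https://api.crossref.org/works/{doi}"

with urllib.request.urlopen(url, timeout=10) as response:
    record = json.load(response)["message"]

print("Title:    ", record.get("title", ["(none)"])[0])
print("Journal:  ", record.get("container-title", ["(none)"])[0])
print("Publisher:", record.get("publisher", "(none)"))
print("Resolves: ", f"https://doi.org/{doi}")
```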

The next step will be for researchers to encourage those of their publishers that do not yet implement DOIs on their publications to do so; these DOIs can then be used with confidence by authors when citing a journal article.

But of course this does not tackle the issue of online legislation, government reports, and cases. There is a need to include these digital objects as well; those websites also change over time. Government departments in the past have been the worst offenders, not archiving reports, etc consistently on the internet, and making wholesale changes to websites when governing parties change at election time. And how nice would it be to have this sort of standard available within databases such as Lexis or Westlaw, especially for cases where the citation is to the paper volume – e.g. [1932] A.C. 562 – and not a digital citation. The URL in Lexis Library for me when I access this case is currently:
http://www.lexisnexis.com/uk/legal/search/enhRunRemoteLink.do?lexisReco=true&A=0.36118831000175833&bct=A&service=citation&linkInfo=F%23GB%23AC%23PAGE%25562%25YEAR%251932%25&langcountry=GB&ersKey=23_T19419781734&backKey=null&recommendsType=LexisRecoCitationSuggestions&lexisReco=true
How user friendly is that?

CrossRef are currently testing a service that will allow one to assign a DOI to any content on the web. The pilot, tentatively called “OpCIt” was discussed at their annual meeting in 2013. This may be an answer to the link rot problem.

There is an urgent need to have the same reliability of persistent and consistent access to the digital version of cases, government reports and publications as we have to our physical, leather-bound volumes, such as 32 Henry VIII (i.e., the acts passed in 1540 in England), a citation unchanged by the passage of time, the object still fully legible, and through its class mark, easy to locate on an assigned shelf in our physical library.

Categories: Teknoids Blogs

The ‘Heartbleed’ Bug and How Internet Users Can Protect Themselves

The Chronicle Wired Campus - Fri, 04/11/2014 - 03:57

Security professionals working in higher education are updating servers, reissuing certificates used to guarantee secure Internet transactions, and encouraging students and faculty and staff members to take a break from the commercial Internet following the discovery of a programming flaw in a widely used Internet tool.

Dubbed “Heartbleed,” the Internet-security breakdown cuts across industries and has raised anew questions about the vulnerability of proprietary data and personal information shared online.

In an email interview with The Chronicle, Steven Lovaas, information-technology-security manager at Colorado State University, laid out the basics and described what Internet users can do to protect themselves.

Q. What’s all this about Heartbleed?

A. On April 7, researchers found a flaw in one of the tools used to secure Internet traffic. That tool, called OpenSSL, is responsible for providing security on the Internet. The bug allows an attacker to capture usernames, passwords, and pretty much any other information.

Q. Why does this matter?

A. This is a big deal. Much of the Internet relies on OpenSSL to protect secure traffic. At least 500,000 servers worldwide appear to be affected by the bug, and some personal computers and mobile devices are also affected. Until the bulk of affected computers are fixed, or patched, any secure site on the Internet is potentially dangerous to visit.

Q. What are colleges doing to respond?

A. The higher-ed community has been pretty impressive in its willingness to work together and share information to get this solved. So far, I don’t see nearly as much of that in the private sector. CSU has patched all our vulnerable servers that are exposed to the Internet. We’re also working on hunting down all internal servers that are still vulnerable, and will be getting those all patched very soon. We’re monitoring the situation, and we’ll notify owners of any additional affected computers we find. [See here, here, and here for some other universities' responses to the situation.]

Q. What should I do?

A. First off, don’t panic. While this is a serious vulnerability, security folks at CSU and around the world are working around the clock to reduce the risk. Nevertheless, there are some things you can do while the world catches up:

  • Avoid online banking and shopping for a few days, if you possibly can.
  • Don’t change your online banking password until your bank tells you that it’s OK; otherwise you may just be giving attackers your new password.
  • Be very suspicious of any emails asking you to change passwords.
  • Remember that legitimate college emails will never ask you to respond with sensitive information such as password, Social Security number, or bank-account number.
  • Apply the latest security updates to your home and work computers, as well as to your mobile devices.

For more information about Heartbleed, this piece on public radio’s Marketplace is a good place to start.

Categories: Teknoids Blogs

AL Asswad et al.: Definition Extractions from Code of Federal Regulations

Legal Informatics Blog - Fri, 04/11/2014 - 02:47

Dr. Mohammad M. AL Asswad of the Legal Information Institute, and Deepthi Rajagopalan and Neha Kulkarni of Cornell University, presented a poster entitled Definitions Extractions from Code of Federal Regulations, at Cornell University’s BOOM 2014 competition.

The poster won an award at the competition, according to a post at the Legal Information Institute.

Here is a description of the poster:

Working collaboratively with Cornell Law School students Alice Chavaillard and Rodica Turtoi, the team developed software that uses natural language processing and machine learning techniques [balanced random forest] to identify sections of federal law that define important terms. In this collaborative project, the Cornell Law students served as domain area experts and helped to produce the data needed to train the computers to classify a paragraph of text as a definition or non-definition. The engineering team then wrote software that determines the scope of the definition (where the definition applies), parses out the defined terms, and finds the boundaries of definitions that are long and complex. Once defined, the definition may be linked to other parts of relevant regulations. So when you find the term water in your particular regulation, you can click the term to be taken to the specific definition of water that applies to you, whether the definition resides in that regulation or in another section of the law.[...]

For more details, please see the poster and the announcement.
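
To make the classification step concrete, here is a rough sketch of the general approach the poster describes; it is not the team's code, the tiny training set is invented, and scikit-learn's class_weight="balanced_subsample" merely stands in for a true balanced random forest.

```python
# A rough sketch, not the team's code, of the general approach: vectorize
# paragraphs of regulatory text and train a random forest that compensates for
# definitions being much rarer than other paragraphs. scikit-learn's
# class_weight="balanced_subsample" stands in for a true balanced random
# forest; the tiny training set is invented purely for illustration.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

paragraphs = [
    '"Waters of the United States" means all interstate waters and wetlands.',
    'The administrator shall publish the report within 90 days.',
    '"Employer" means any person acting directly in the interest of an employer.',
    'Each applicant must submit the form described in subsection (b).',
]
labels = ["definition", "non-definition", "definition", "non-definition"]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    RandomForestClassifier(n_estimators=200, class_weight="balanced_subsample"),
)
model.fit(paragraphs, labels)

test = ['"State" means any of the fifty States and the District of Columbia.']
print(model.predict(test))   # expected to lean toward "definition"
```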

HT @LIICornell


Filed under: Applications, Posters, Technology developments Tagged: Alice Chavaillard, Automated legal text processing, Balanced random forest, BOOM, BOOM 2014, CFR, Code of Federal Regulations, Cornell Law School, Deepthi Rajagopalan, Extracting definitions from legal text, Legal Information Institute at Cornell University, Legal machine learning, Legal natural language processing, Legal text extraction, Legal text mining, Legal text processing, Legislative information systems, Machine learning and law, Mohammad AL Asswad, Natural language processing and law, Neha Kulkarni, Regulatory information systems, Rodica Turtoi
Categories: Teknoids Blogs

Publications Nominated for the 2014 Hugh Lawford Award for Excellence in Legal Publishing

slaw - Thu, 04/10/2014 - 15:24

Every year, the Canadian Association of Law Libraries (CALL) hands out the Hugh Lawford Award for Excellence in Legal Publishing.

It honours a publisher (whether for-profit or not-for-profit, corporate or non-corporate) that has demonstrated excellence by publishing a work, series, website or e-product that makes a significant contribution to legal research and scholarship.

The nominees for this year are:

  • The Queen’s Bench Rules of Saskatchewan: Annotated, 4th ed. (Law Society of Saskatchewan Libraries)
  • Juris Classeur Québec (LexisNexis Canada)
  • Copyright Law, Fourth Edition (John Wiley and Sons, Inc.)
  • GALLOP: Government and Legislative Libraries Online Publications Portal (Association of Parliamentary Libraries in Canada/ L’Association des bibliothèques parlementaires au Canada)

The award honours Hugh Lawford (1933-2009), Professor of Law at Queen’s University and the founder of Quicklaw.

The award will be presented to the recipient at a reception during the 2014 CALL Annual Meeting in Winnipeg in late May.

Slaw.ca received the award in 2009.

Categories: Teknoids Blogs

My Twitter Digest for 04/09/2014

<CONTENT /> v.5 - Thu, 04/10/2014 - 14:30
Categories: Teknoids Blogs

QuickWire: Contractor Says He Hacked Maryland Network to Expose Flaws

The Chronicle Wired Campus - Thu, 04/10/2014 - 13:25

A former employee of a University of Maryland contractor has told The Baltimore Sun that he breached the university’s network in an effort to highlight cybersecurity flaws that he said were being ignored.

The employee, David Helkowski, was with a company hired to work on a university website when, he said, he noticed and reported security flaws. When no action was taken to correct them, he said, he took administrators’ information, including President Wallace D. Loh’s Social Security and cellphone numbers, and posted it online. He is now under FBI investigation.

Maryland publicly reported the incident on March 20. It followed on the heels of a major breach at the university in February in which the records of 287,580 students and staff were stolen.

Categories: Teknoids Blogs

The Internet: a Primer for Lawyers and Everyone Else

The Lawyerist - Thu, 04/10/2014 - 10:52

The new media site, Vox, is quickly establishing itself as a sort of FAQ (frequently-asked questions) for just about everything you might want to know about current events. And its “Everything you need to know about the Internet” article/resource page/deck of “cards” is a must-read.

Remember the comment to Rule 1.1:

[8] To maintain the requisite knowledge and skill, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology

That “should” makes the comment aspirational, sure, but it more than suggests that competent lawyers must know more than where to find the power button on a CPU. A great deal more than you will find in the Vox article, of course, but it does contain essential information for lawyers. It’s hard to appreciate the true threat posed by the Heartbleed security flaw, for example, if you don’t have a basic understanding of the role SSL plays in your everyday Internet use. Or the risks of using the cloud if you don’t know what the cloud is.

“If you don’t have a basic understanding of the technology you entrust with your clients’ information … I think you probably are not competent to represent anyone.”

Because a basic understanding of the Internet is so essential to the tools we trust with our clients’ information every day, I’m comfortable going a bit further than the comment to Rule 1.1. If you don’t have a basic understanding of the technology you entrust with your clients’ information so that you can make informed choices about security, I think you probably are not competent to represent anyone.

That’s just my opinion. But like your computer, the Internet is now a tool of the lawyering trade. Not knowing how it works is just as much a problem as not knowing how to put together a brief.

So go read the Vox article. (Then, encrypt your files already.)

Categories: Teknoids Blogs

Change to Citation Format for Consolidated Quebec Laws and Regs

slaw - Thu, 04/10/2014 - 08:25

The Government of Quebec has announced that effective April 2014, the proper citation format for Quebec laws and regulations derived from the consolidated collection will be RLRQ. The previous abbreviations were LRQ (statutes) and RRQ (regulations).

The new policy can be viewed by clicking on this link.

See: la Gazette officielle du Québec, partie 2, (2 avril 2014, no 14, pg 1303): Politique sur le recueil des lois et des règlements du Québec.

This revised Policy replaces the Politique sur le Recueil des lois et des règlements du Québec, published on January 3rd, 2013.

SOQUIJ users will be able to search using any of the abbreviations noted above.

Categories: Teknoids Blogs

Ontario’s Ministry of Labour Targets Employers Using Unpaid Internships

slaw - Thu, 04/10/2014 - 08:00

From April to June 2014, the Ontario Ministry of Labour is conducting an employment standards inspection blitz targeting organizations that employ unpaid interns. The goal is to ensure worker rights are protected and enhance employers’ awareness of their responsibilities.

The law

Ontario’s Employment Standards Act (ESA) does not mention interns and unpaid internships by name, but employers must understand that the law does cover them. The Act specifies that an employee (a person who receives wages for services or work performed or offered to an employer) includes a person who receives training from an employer. In other words, most people who work for an employer qualify as employees, with two strict exceptions.

One, a person who meets all of the following conditions is not considered an employee:

  • The training is similar to that which is given in a vocational school
  • The training is for the benefit of the individual
  • The person providing the training derives little, if any, benefit from the activity of the individual while he or she is being trained
  • The individual does not displace employees of the person providing the training
  • The individual is not accorded a right to become an employee of the person providing the training
  • The individual is advised that he or she will receive no remuneration for the time that he or she spends in training

And two, the Act doesn’t apply to approved internship or co-op programs provided by post-secondary schools.

The aim of the first exception is to prevent employers from using unpaid workers in place of paid workers. The aim of the second is to encourage employers to provide necessary practical experience to students.

However, we have seen over the years that many, if not most, employers using unpaid interns do not fit these exceptions. If they do not, they must pay their interns.

Enforcement blitz into illegal unpaid internships

The issue of unpaid work has been a hot topic for quite a while now, and despite efforts by authorities to clarify the legal status, the use of illegal unpaid interns has continued. Although the Ministry of Labour and Statistics Canada do not have data on Canadian internship programs, according to a Toronto Star article there are upwards of 300,000 unpaid internships in Canada, and 100,000 of them are off the record with no workplace safety training.

Advocates and opponents of unpaid internships, including lawyer Andrew Langille on his blog Youth and Work and on Twitter @youthandwork, have been calling on the Ministry of Labour to stop the illegal practice. Several complaints were filed and finally the ministry is acting.

Businesses across a variety of sectors known for having unpaid internship programs, such as marketing/public relations, software development, retail, media, film and entertainment, are being inspected through a proactive enforcement blitz to check compliance with core ESA standards.

The Ministry of Labour took action on two big known offenders before the enforcement blitz, inspecting Toronto Life (published by St. Joseph Media) and The Walrus magazines. The ministry found that these magazines’ unpaid internship programs failed to meet Ontario’s employment standards; they were not approved internship or co-op programs associated with post-secondary schools and the interns were being used in place of paid workers. Inspectors issued compliance orders for violations of several standards: wage statements, record keeping, minimum wage, public holiday pay and vacation pay. This means that, pending any appeal, the workers involved have to be paid.

Instead of modifying these programs to fit the law, these magazines promptly responded by shutting down their internship programs and kicking all their interns out the door. And news of the ministry’s action spread quickly, with companies like Rogers (Flare and Chatelaine magazines), Canadian Geographic, Quill and Quire, Fashion, The Grid and others shutting down their internship programs as well. Generally, they are claiming that they lack funds to pay interns.

From the interns’ perspective, many are saying that these unpaid internships are not meeting their intended goal, which is to provide the skills and experience they need in a required field and references to get a job. Rather, they say that employers are taking advantage of them and they are often doing menial work that has nothing to do with the training they expected or require.

What should be done?

Though late in coming, the Ministry of Labour’s enforcement blitz is a good step forward. But more needs to be done. Employers wanting to establish internship programs should not be relying on exemptions in the law as a guide. The Ontario government needs to amend the Employment Standards Act to directly mention interns and internship programs and establish clear guidelines for employer-intern relations and to help interns and employers understand their rights and responsibilities.

There is little doubt that internships can provide valuable experience and vocational training at the start of a person’s career or during a transition, but it is crucial that employers meet the legislative criteria and refrain from misusing their interns. It is not only an issue of pay, but also working conditions. A person who does not qualify as an employee may not be protected by other employment standards such as those respecting workplace health and safety. This is unacceptable.

The enforcement blitz in unpaid internships continues, and we look forward to seeing the overall results after June 2014, and how the ministry intends to respond.

Categories: Teknoids Blogs

Gather All Ye Faithful

slaw - Thu, 04/10/2014 - 06:00

Many clergy have complained of contemporary society’s loss of faith. Attendance at religious service is down. Faith in the Almighty is considered quaint, antiquated or – by the more rabidly atheist – downright stupid and offensive. Yet rare is the church where doomsday promises of Armageddon-induced hellfire have sparked a mass return to the foot of the altar. I therefore find it peculiar when Federal Justice Minister, Peter MacKay, bemoans Canadians’ loss of faith in the criminal justice system while in the same breath repeating his oft-made promise to rain a fury of new tough-on-crime hail from on-high upon the land.

In a recent speech to University of Calgary law students, the Minister opined on the sensitive and difficult subject of dealing with the not-criminally-responsible (NCR) mentally ill accused saying, “We must ask ourselves some fundamental questions about our profession. Do Canadians have faith in the justice system, and what changes must we make to increase public faith and confidence in our system?” Does a system that reviews those incarcerated for mental-illness induced crimes every three years rather than the previous annual assessment inspire increased public confidence? When an NCR offender is deemed psychiatrically fit to begin reintegration back into society, does a further period of forced custody send a message of faith and confidence in our justice system or does it instead proclaim the triumph of fear and vengeance over reason and compassion?

It is hard to take seriously complaints of a loss of faith in the justice system from a government that has brought us its fair share of justice-related gaffes. Who can forget Vic Toews as Public Safety Minister (now His Honour Justice Toews of the Manitoba Court of Queen’s Bench) suggesting that Canadians had to choose between granting authorities easier access to personal telecommunication information or supporting child pornographers? I didn’t feel my faith in the justice system swelling at those comments.

Did you feel a sudden restoration of faith in justice when Stockwell Day cited the statistic of rising unreported crime (!) to justify a $5.1 billion expansion of our Federal prison system? Sure, reported crime is down but what about all those crimes happening across the country that nobody wants to tell police about? We in the Government have good reason to believe that imaginary crime is waaaay up and we need a massive expansion of our federal prison system so that we can put all those unreported criminals into jail longer. Faith and confidence indeed.

Recent amendments forcing Judges to impose $100+ “surcharges” on all offenders regardless of their ability to pay or mental capacity don’t speak to efforts to bolster faith in a well-reasoned justice system capable of balancing the legitimate concerns of victims with the realities of financially destitute offenders. Ordering a schizophrenic street kid to pay $100 when convicted of mischief to property delivers no message of faith and confidence. It undermines sanity and logic, bringing the administration of justice into disrepute.

Our Government regularly complains about activist out-of-touch judges meddling into Parliament’s bailiwick by tossing out mandatory minimum sentences or invalidating Criminal Code amendments, yet over a decade in power it is that same Conservative Government that has appointed every Superior Court judge across the country. When an ‘activist’ Supreme Court unanimously kicked Canada’s entire prostitution legislation to the curb, five out of the nine Justices were Harper appointees. Is it possible that maybe, just maybe, even Conservative jurists think the tough-on-crime agenda is failing Canadians?

Perhaps if the claims of a widespread loss of faith in the criminal justice system are legitimate, a good hard look in the mirror is called for. I echo Minister MacKay’s desire to shore up the faith of Canadians in our justice system but I choose to support that laudable goal in a much different fashion.

Rather than disparaging with politically motivated sound bites the very same judges our government continues to appoint, the Minister could lead by example by demonstrating a desire to pay heed to the increasing messages from the bench decrying mandatory minimum penalties and uncollectable surcharges.

The time has come to support not just the principle of open courts but the implementation of it by encouraging (or perhaps even legislating) the introduction of cameras in our nation’s courtrooms. Canadians should be encouraged to watch not only the scandalous and salacious cases that inhabit our courtrooms from time-to-time but the daily drudgery of human misery tinged with a hope for the future that shuffles through Provincial plea courts from coast-to-coast.

Call out the system’s failures but celebrate its successes with equal fervor. If encouraging faith in the justice system were genuinely a priority for Mr. MacKay, he could start by reminding Canadians that crime by nearly every statistic has been down meaningfully year over year for decades, with a particularly stark decrease noted in 2013 alone. Despite the numbers, our Federal Government continues to resort to bombastic fear mongering in decrying the evil that purportedly lurks around every corner. Canadians from St. John’s to Vancouver of all ages and socio-economic stripes are safer today than they have ever been in recent recorded history. They are less likely to be victims of crime than in nearly any other country on earth. That’s a system that, despite its warts and bruises, is worthy of the public’s faith and confidence.

Categories: Teknoids Blogs

Thursday Thinkpiece: Hughes and Bryden on the Test for Judicial Disqualification

slaw - Thu, 04/10/2014 - 06:00

Each Thursday we present a significant excerpt, usually from a recently published book or journal article. In every case the proper permissions have been obtained. If you are a publisher who would like to participate in this feature, please let us know via the site’s contact form.

Refining the Reasonable Apprehension of Bias Test: Providing Judges Better Tools for Addressing Judicial Disqualification
Jula Hughes & Philip Bryden
36:1 Dalhousie Law Journal (2013) 171-192

Introduction

The “reasonable apprehension of bias” test for judicial disqualification has been a fixture of Canadian law for many years, at a minimum since its formulation in the National Energy Board case in 1978.[1] By that time, the Supreme Court of Canada was able to draw on a long history of Canadian and other common law precedents in support of identically or similarly framed tests for determining judicial impartiality. Despite a considerable amount of litigation concerning judicial impartiality since that time, the test itself has remained fundamentally unaltered and is well accepted in the jurisprudence. Unfortunately, the application of the test continues to generate difficulties for judges who need to use it to make decisions in marginal cases.[2]

Our experience as facilitators in judicial education seminars over the years led us to believe that judges often have very different views about how the “reasonable apprehension of bias” test should be applied. In particular, we noticed that even when the case law indicated that it was not necessary for a judge to recuse himself or herself in a particular situation, it was quite common for judges participating in seminars to suggest that the application of the general reasonable apprehension of bias test would lead them to a different conclusion. We hypothesized that these differences of opinion might not appear in the reported case law because in many instances judges recuse themselves of their own motion, with the result that the case law significantly under-reports the incidence of recusal. With the assistance of the Canadian Association of Provincial Court Judges, we sought to develop a better understanding of how judges think about recusal and disqualification by conducting a survey of 137 Canadian provincial and territorial judges concerning their experience of and attitudes toward recusal in analytically marginal cases. . . .

. . . . .

II. Why should we care about variability in the application of the reasonable apprehension of bias test?

A person who was skeptical about modifying the “reasonable apprehension of bias” test might well argue that there is nothing special about judges having differences of opinion in marginal cases. What makes a case marginal, after all, is our ability to make plausible arguments for or against a particular result. The use of a “reasonableness” standard implies a requirement that a range of considerations will come into play in different contexts, and over time the common law method has allowed judges using the “reasonableness” standard to develop a coherent jurisprudence in a variety of different areas of the law.[12] Why, the skeptic might ask, should the law with respect to judicial impartiality be any different?

We offer two responses to the skeptic’s query. One is to observe that refinement of the tests judges use to make decisions is an important part of the common law method. If we are able to refine the reasonable apprehension of bias test in a manner that helps to explain the jurisprudence, and make it more coherent and easier for judges to apply consistently, even a skeptic ought to agree that the effort is worthwhile. We would argue that the level of disagreement among judges concerning the application of the test in marginal cases that we have identified suggests that there is considerable room for improvement. We acknowledge that we still need to demonstrate to the skeptic that we have succeeded in our attempt to improve upon the “reasonable apprehension of bias” test.

The second response is that disagreements about whether judges should or should not recuse themselves have a meaningful impact on the administration of justice, and any improvement we can offer in this area will benefit both the judges and the members of the public who use our court system. The point of departure for this argument is that recusal is sufficiently frequent to have an impact on the effective and efficient administration of justice. As we noted above, in “Tip of the Iceberg,” two-thirds of our respondents indicated that they would recuse themselves between one and five times in a typical year. Another nineteen per cent reported recusing more than five times a year. Only fourteen per cent indicated that they would not recuse themselves at all in a typical year.[13]

The reported case law on recusal, while extensive, does not begin to reflect the actual scope of recusal as the vast majority of recusal decisions are made by judges on their own motion. This was true for the respondents to our survey,[14] and judging from comments of participants at judicial education seminars, it is true for judges at all levels of Canadian courts. It is difficult to estimate the global impact of recusal on the administration of justice as the degree of disruption to the work schedule of the court will vary depending on the stage in the proceedings at which recusal takes place, the context in which the judge works, and the particularities of the case. The same factors will influence the extent to which recusal has an impact on the parties. In most instances a decision not to sit taken prior to the docket being finalized has a very different impact on the parties than a decision to recuse half-way through a trial. The practical consequences for the parties of the recusal of the sole judge in a small town or a judge on circuit are likely to be different than the impact of recusal by a judge in a major centre. The inherent delay that attends many recusal decisions may affect one party more than the other. Similarly, delay in some types of proceedings may have more serious consequences for the parties than in other situations.

Our research also suggests that judges recuse in some marginal situations where jurisprudence and/or policy would suggest that they should not recuse.[15] This may be because judges are apprehensive about appellate treatment or because they do not wish to sit when a party has expressed a lack of confidence in their impartiality.[16] Our research further indicates that certain types of situations are subject to a great deal of uncertainty and disagreement between judges.[17] Greater clarity would likely promote efficiency by avoiding unnecessary recusals. It would also enhance the transparency of judicial practice with respect to recusal and would encourage consistency.

_______________________

1. Committee for Justice and Liberty et al v National Energy Board et al, [1978] 1 SCR 369 at 394 [National Energy Board]. It is worth noting that the National Energy Board case itself did not involve the disqualification of a judge but the disqualification of a member of an administrative tribunal. The National Energy Board test is, however, employed in the leading Supreme Court of Canada decisions involving challenges to the impartiality of judges. See, for example, Wewaykum Indian Band v Canada, 2003 SCC 45, [2003] 2 SCR 259 at para 60 [Wewaykum]; R v RDS, [1997] 3 SCR 484 at para 31, per L’Heureux-Dubé and McLachlin JJ; and para 111, per Cory J.

2. Lorne Sossin has observed that “[t]he law in Canada relating to judicial bias is at once clear and unsettled. It is clear because for purposes both of legal and ethical accountability, the standard of impartiality which has been adopted, that of reasonable apprehension of bias, has found widespread acceptance. It is unsettled because the application of that standard remains very much in flux”: “Judges, Bias and Recusal in Canada” in Hoong Phun Lee, ed, Judiciaries in Comparative Perspective (Cambridge: Cambridge University Press, 2011) 301 at 321.

. . . . .

12. Familiar examples include the reasonable person standard of care in negligence law and the reasonable person standard in the law of self-defence in criminal law. See, for example, Arland v Taylor, [1955] OR 131 (CA); R v Cinous, [2002] 2 SCR 3; and R v Lavallee, [1990] 1 SCR 852.

13. Bryden & Hughes, “Tip of the Iceberg,” supra note 3 at 576-577.

14. Ibid at 572.

15. Ibid at 584-585 and 604-608.

16. There is a significant strand of American thinking on judicial disqualification that favours peremptory challenges where a party subjectively believes that a judge will not be impartial. See Charles Geyh, “Draft Report of the ABA Judicial Disqualification Project” (2008), online: at 60-65; James Sample & Michael Young, “Invigorating Judicial Disqualification: Ten Potential Reforms” (2008) 92 Judicature 26 at 27-28; Debra Lyn Bassett, “Judicial Disqualification in the Federal Appellate Courts” (2002) 87 Iowa Law Review 1213 at 1224 and 1251-1256; Richard Flamm, Judicial Disqualification: Recusals and Disqualification of Judges, 2d ed (Berkeley, CA: Banks and Jordan Law, 2007) at § 26.1.

17. Bryden & Hughes, “Tip of the Iceberg,” supra note 3 at 579-582.

Categories: Teknoids Blogs

Project to create free and open online version of New York City Legal Code

Legal Informatics Blog - Thu, 04/10/2014 - 04:36

Dazza Greenwood of MIT Media Lab, the MIT computational legal science research group, New York City Council Member Ben Kallos, and Thom Neale say they have begun a project to create a free and open version of the New York City Legal Code.

They are holding a Google+ Hangout to discuss this project on 17 April 2014.

Here is the description of the project, from the hangout page:

Collaboration With MIT Media Lab #LegalScience Team and New York City Council Member Ben Kallos in Support of Kallos’ Efforts to Put New York City Legal Code Online in Data-Friendly and REST-Accessible Manner and Related Research, Projects and Activities.

Thom Neale said today that he has been working on this project, and that later this year he plans to convert the data to the format used by the State Decoded platform, and to involve the OpenGov Foundation in the project.
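
As a purely hypothetical illustration of what "data-friendly" access could look like, here is a short sketch of a code section represented as structured JSON that a program can address directly; the schema, identifiers, and text are invented for the example and are not the State Decoded format or any announced New York City API.

```python
# A purely hypothetical sketch of a "data-friendly" municipal code section:
# structured JSON a program can address directly instead of scraping HTML.
# The schema, section numbers, and text below are invented for the example.
import json

section_json = """
{
  "identifier": "10-101",
  "heading": "Definitions",
  "text": "As used in this chapter, the following terms have the meanings set out below.",
  "children": [
    {"identifier": "10-101(a)", "heading": "Agency", "text": "\\"Agency\\" means ..."},
    {"identifier": "10-101(b)", "heading": "Person", "text": "\\"Person\\" means ..."}
  ]
}
"""

section = json.loads(section_json)
print(section["identifier"], section["heading"])
for child in section["children"]:
    print("  ", child["identifier"], child["heading"])
```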


Filed under: Applications, Data sets, Meetings, Projects, Technology developments Tagged: #legalscience, America Decoded, AmericaDecoded, Ben Kallos, Dazza Greenwood, eparticipation systems, Free access to law, Legislative information systems, MIT computational legal science research, MIT Media Lab, Municipal codes, Municipal law information systems, Municipal ordinances, New York City Legal Code, New York City municipal code, New York Municipal Code, Open legal data, Open legislative data, OpenGov Foundation, OpenLegalCode Collaboration, OpenLegalCode Collaboration Call, Public access to legal information, RESTful APIs and legal information systems, State Decoded, The State Decoded, Thom Neale
Categories: Teknoids Blogs