Patent Drafting for Machine Learning:
Structural Claim Limitations,
Avoiding §101 or §112 Rejections
Presenting a live 90-minute webinar with interactive Q&A
TUESDAY, FEBRUARY 13, 2018
1pm Eastern | 12pm Central | 11am Mountain | 10am Pacific

Today’s faculty features:
Gregory Rabin, Patent Attorney, Schwegman Lundberg & Woessner, New York
Michael D. Stein, Partner, Baker & Hostetler, Seattle

The audio portion of the conference may be accessed via the telephone or by using your computer's speakers. Please refer to the instructions emailed to registrants for additional information. If you have any questions, please contact Customer Service at 1-800-926-7926 ext. 1.
Tips for Optimal Quality
Sound Quality
If you are listening via your computer speakers, please note that the quality
of your sound will vary depending on the speed and quality of your internet
connection.
If the sound quality is not satisfactory, you may listen via the phone: dial
1-866-819-0113 and enter your PIN when prompted. Otherwise, please
send us a chat or e-mail [email protected] immediately so we can address
the problem.
If you dialed in and have any difficulties during the call, press *0 for assistance.
Viewing Quality
To maximize your screen, press the F11 key on your keyboard. To exit full screen,
press the F11 key again.
FOR LIVE EVENT ONLY
Continuing Education Credits
In order for us to process your continuing education credit, you must confirm your
participation in this webinar by completing and submitting the Attendance
Affirmation/Evaluation after the webinar.
A link to the Attendance Affirmation/Evaluation will be in the thank you email
that you will receive immediately following the program.
For additional information about continuing education, call us at 1-800-926-7926
ext. 2.
Copyright 2017 Schwegman Lundberg & Woessner, P.A. All Rights Reserved.
Patent Drafting for Machine Learning: Structural Claim Limitations, Avoiding 101 or 112 Rejections
Greg Rabin
February 13, 2018
Introduction
Greg Rabin
www.linkedin.com/in/gregrabin
Patent attorney at Schwegman, Lundberg, and Woessner, PC, based in Ithaca, NY.
Subject of this Talk
Anticipating and Avoiding 101 Rejections for Machine Learning
Disclaimer
• The subject matter of this presentation is the speaker’s opinion only, and does not necessarily correspond to the opinions of the speaker’s employer or clients.
• The information in this presentation may not correspond to constantly changing information/guidance provided by the courts or the USPTO.
What is Artificial Intelligence?
• Artificial intelligence (AI) is the ability of a computer program or a machine to think and learn. It is also a field of study which tries to make computers “smart.”
• Example use cases include: understanding human speech, playing strategic games (e.g., chess), self-driving cars, and interpreting complex data.
• An ultimate goal of AI is to create computers that can learn, solve problems, and think logically.
• AI involves a combination of many different fields, such as computer science, mathematics, linguistics, psychology, neuroscience, and philosophy.
Long-Term Problems in AI
• Creating a “general artificial intelligence,” which can solve many different problems, instead of focusing on just one.
• Might this include making new discoveries or inventions?
• Developing a machine capable of “perceiving its environment” and making decisions or taking actions accordingly, like a human.
AI Sub-Field
Search and Optimization
• Intelligently searching through many possible solutions, taking advantage of the machine’s high processing speed.
AI Sub-Field
Neural Networks
• Neural networks are modeled after the neurons in the human brain: a trained algorithm determines an output response for input signals. Multiple layers process an input, with each successive layer applying increasingly complex processing.
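The layered processing described above can be sketched in a few lines of Python. This is an illustration of mine, not from the slides; the weights and inputs are invented for demonstration. Each neuron computes a weighted sum of its inputs and passes it through an activation function, and a layer is just a collection of such neurons.

```python
import math

def neuron(inputs, weights, bias):
    """One neuron: weighted sum of inputs plus bias, through a sigmoid."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weight_matrix, biases):
    """A layer applies several neurons to the same input signals."""
    return [neuron(inputs, w, b) for w, b in zip(weight_matrix, biases)]

# Two-layer forward pass over a 3-element input signal (values invented).
hidden = layer([0.5, -1.0, 2.0],
               [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]],
               [0.0, 0.1])
output = layer(hidden, [[1.0, -1.0]], [0.0])
print(output)  # a single output response between 0 and 1
```

Stacking more layers between the input and output is what lets later layers apply the "more and more complex processing" the slide refers to.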
AI Sub-Field
Control Theory
• Control of continuously operating dynamical systems in engineered processes and machines. The objective is to develop a control model for controlling such systems using a control action in an optimum manner without delay or overshoot and ensuring control stability.
What is Machine Learning?
• Machine Learning gives “computers the ability to learn without being explicitly programmed.”
• Machine learning explores the study and construction of algorithms that can learn from and make predictions on data. Rather than following strictly static program instructions, such algorithms make data-driven predictions or decisions by building a model from sample inputs.
• Example applications include email filtering, detection of network intruders, optical character recognition (OCR), and computer vision.
• Source: Wikipedia: Machine Learning [https://en.wikipedia.org/wiki/Machine_learning]
Types of Machine Learning
Supervised Learning: The computer is presented with example inputs and their desired outputs, given by a “teacher,” and the goal is to learn a general rule that maps inputs to outputs.
– E.g. The animals in the picture are labeled as “cat” or “dog”, and the computer trains itself to identify other animals, in other pictures, as either “cat” or “dog”.
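The cat/dog example above can be made concrete with a minimal supervised learner. This sketch is mine, not from the slides: the labeled examples play the role of the "teacher," the two numeric features (invented here, e.g. weight and ear length) stand in for image data, and a 1-nearest-neighbor rule is the learned mapping from inputs to outputs.

```python
# Labeled training examples: (feature vector, label) pairs from the "teacher".
# Features are invented for illustration: (weight in kg, ear length in cm).
training = [
    ((4.0, 7.0), "cat"), ((5.0, 6.5), "cat"),
    ((25.0, 10.0), "dog"), ((30.0, 12.0), "dog"),
]

def classify(x):
    """Label a new input with the label of its nearest training example."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(training, key=lambda ex: sq_dist(ex[0], x))[1]

print(classify((4.5, 6.8)))    # small animal -> "cat"
print(classify((28.0, 11.0)))  # large animal -> "dog"
```

The general rule (nearest neighbor over the labeled set) is learned entirely from the example input/output pairs, which is the defining property of supervised learning.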
Types of Machine Learning
Unsupervised Learning: No labels are given to the learning algorithm, leaving it on its own to find structure in its input. Unsupervised learning can be a goal in itself (discovering hidden patterns in data) or a means towards an end (feature learning).
– E.g. Classify the animals below into different types.
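As a concrete unsupervised example (mine, not from the slides), k-means clustering groups unlabeled points into k clusters with no "teacher" at all; the data values below are invented 1-D measurements.

```python
import random

def kmeans(points, k, iters=20):
    """Cluster unlabeled 1-D points into k groups; return sorted centers."""
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[nearest].append(p)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

random.seed(0)
data = [1.0, 1.2, 0.8, 9.9, 10.1, 10.0]
centers = kmeans(data, 2)
print(centers)  # two centers, one near 1.0 and one near 10.0
```

The algorithm discovers the hidden two-group structure purely from the input values, which is the "goal in itself" the slide mentions.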
Types of Machine Learning
Reinforcement Learning: A computer program interacts with a dynamic environment in which it must perform a certain goal (such as driving a vehicle or playing a game against an opponent). The program is provided feedback in terms of rewards and punishments as it navigates its problem space.
– E.g. Learn to drive…
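A toy version of the reward-driven feedback loop described above can be sketched with tabular Q-learning. This example is mine, not from the slides: the "environment" is an invented five-cell corridor where the agent is rewarded (+1) for reaching the rightmost cell, and it learns a driving policy from that feedback alone.

```python
import random

random.seed(1)
n_states, actions = 5, (-1, +1)   # actions: move left / move right
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for _ in range(300):              # training episodes
    s = 0
    while s != n_states - 1:      # episode ends at the goal cell
        # Epsilon-greedy action selection: mostly exploit, sometimes explore.
        a = (random.choice(actions) if random.random() < epsilon
             else max(actions, key=lambda act: Q[(s, act)]))
        s2 = min(max(s + a, 0), n_states - 1)
        r = 1.0 if s2 == n_states - 1 else 0.0   # reward only at the goal
        # Q-learning update from the observed reward and next state.
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in actions)
                              - Q[(s, a)])
        s = s2

policy = [max(actions, key=lambda act: Q[(s, act)])
          for s in range(n_states - 1)]
print(policy)  # the agent learns to move right (+1) from every non-goal cell
```

No labeled input/output pairs are ever provided; the policy emerges from rewards observed while the program navigates its problem space, which is what distinguishes reinforcement learning from the two paradigms above.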
Example ML Claim Ver. A
1. A method comprising:
accessing, at one or more computing devices, an image;
determining whether the accessed image includes a cat; and
providing an output indicating whether the accessed image includes the cat.
Example ML Claim Ver. A
This claim is directed to cat-recognition technology. It is likely invalid under 35 U.S.C. 101 because it preempts all cat-recognition technologies: every algorithm that identifies cats will access an image, determine whether the image includes a cat, and provide a corresponding output.
Example ML Claim Ver. A
This claim might become patentable by…
• Adding claim limitations related to training the machine learning model (Ver. B).
• Adding claim limitations related to the feature vector used in the machine learning (Ver. C).
• Adding claim limitations related to the output of the machine learning (Ver. D).
Example ML Claim Ver. B
1. A method comprising:
accessing, at one or more computing devices, an image;
determining, using an image classification engine stored in one or more memories of the one or more computing devices, whether the accessed image includes a cat, the image classification engine being trained, using unsupervised learning, to classify images into a plurality of classes and being notified, via a graphical user interface (GUI), which classes correspond to cats, the GUI displaying at least one image from each class of the plurality of classes; and
providing an output indicating whether the accessed image includes the cat.
Example ML Claim Ver. B
The method claim in Version B does not preempt all use of “cat-recognition technology.”
Rather, the method claim details a specific technique by which the one or more computing devices are trained to recognize cats.
Example ML Claim Ver. B
Other inventors could possibly invent other “cat-recognition” techniques that include different operations/components from those listed in Version B of claim 1. For example, supervised learning or reinforcement learning could be used.
Example ML Claim Ver. B
It should be noted that the Version B claim might still be invalid, and some Examiners may insist on more implementation details being added to the claim before finding it patent-eligible under 35 U.S.C. 101.
Example ML Claim Ver. C
1. A method comprising:
accessing, at one or more computing devices, an image;
determining, based on a feature vector comprising a plurality of features that are extracted from the accessed image, whether the accessed image includes a cat, the plurality of features comprising at least an identification of a portion of the accessed image as representing a paw, a symmetry of the paw, and a shape of the paw; and
providing an output indicating whether the accessed image includes the cat.
Example ML Claim Ver. C
The method claim in Version C does not preempt all use of “cat-recognition technology.”
Rather, the method claim details a specific technique by which the one or more computing devices recognize cats using explicitly specified features of a feature vector.
Example ML Claim Ver. C
Other inventors could possibly invent other “cat-recognition” techniques that include different operations/components from those listed in Version C of claim 1. For example, features other than the paw could be used to recognize the cat.
Example ML Claim Ver. C
It should be noted that the Version C claim might still be invalid, and some Examiners may insist on more implementation details being added to the claim before finding it patent-eligible under 35 U.S.C. 101.
Example ML Claim Ver. D
1. A method comprising:
accessing, at one or more computing devices, an image;
determining whether the accessed image includes a cat; and
providing an output indicating whether the accessed image includes the cat, the output comprising overlaying the accessed image with a first text if the accessed image includes the cat, and overlaying the accessed image with a second text, different from the first text, if the accessed image does not include the cat.
Example ML Claim Ver. D
The method claim in Version D does not preempt all use of “cat-recognition technology.”
Rather, the method claim details a specific technique by which the one or more computing devices output information representing that a cat was recognized in an image.
Example ML Claim Ver. D
Other inventors could possibly invent other “cat-recognition” techniques that include different operations/components from those listed in Version D of claim 1. For example, the output could be presented in a different format.
Example ML Claim Ver. D
It should be noted that the Version D claim might still be invalid, and some Examiners may insist on more implementation details being added to the claim before finding it patent-eligible under 35 U.S.C. 101.
A Real-Life Example from Public PAIR
ADDITIONAL DISCLAIMER
The real-life example described here is not, in any way, representative of a 101 trend. The speaker makes no guarantees that this claim, or any similar claim, will be upheld in litigation, in review by the Patent Office, or in another patent application.
The real-life example is only one example of a published patent. The slides contain excerpts from the file wrapper history and do not include the entire history.
A Real-Life Example from Public PAIR
The real-life example corresponds to a US patent application, filed in February 2016, which led to a patent that issued in August 2017.
The real-life example relates to uses of machine learning in a question answering system.
A Real-Life Example from Public PAIR
Note: All data used in this presentation related to the above-referenced patent application is copied from Public PAIR, patft.uspto.gov, patents.google.com, and other publicly accessible sources. The speaker and the speaker’s employer were not involved in the drafting or prosecution of the above-noted patent application.
Claim Rejected Under 35 U.S.C. 101
1. A method, in a data processing system, for question answering using multi-instance learning, the method comprising:
training an answer ranking multi-instance learned model using a ground truth question and answer-key pair set;
receiving an input question from a user;
generating one or more candidate answers to the input question, wherein each of the one or more candidate answers has an associated set of supporting passages;
determining a confidence value for each of the one or more candidate answers using the answer ranking multi-instance learned model based on the set of supporting passages;
ranking the one or more candidate answers by confidence value to form a ranked set of answers; and
presenting a final answer from the ranked set of answers, the confidence value for the final answer, and supporting evidence for the final answer to the user.
Amendment Leading to Patent-Eligibility Under 101
1. A method, in a data processing system comprising at least one processor and at least one memory, the at least one memory comprising instructions executed by the at least one processor to implement a question answering system, for question answering using multi-instance learning, the method comprising:
training an answer ranking multi-instance learned model using a ground truth question and answer-key pair set, wherein the data processing system executes in accordance with the answer ranking multi-instance learned model to implement the question answering system, wherein the answer-ranking multi-instance learned model is trained based on whether a passage set returned for each answer in the question and answer-key pairs set collectively does or does not support a correct answer;
receiving, by the question answering system, an input question from a user;
Amendment Leading to Patent-Eligibility Under 101
generating, by a hypothesis generation phase of a question answering pipeline of the question answering system, one or more candidate answers to the input question, wherein each of the one or more candidate answers has an associated set of supporting passages;
determining, by a hypothesis and evidence scoring phase of the question answering pipeline of the question answering system, a confidence value for each of the one or more candidate answers using the answer ranking multi-instance learned model based on the set of supporting passages;
ranking, by an answer ranking phase of the question answering pipeline of the question answering system, the one or more candidate answers by confidence value to form a ranked set of answers; and
presenting, by the question answering system, a final answer from the ranked set of answers, the confidence value for the final answer, and supporting evidence for the final answer to the user.
Result of this Amendment
• 35 U.S.C. 101 (patent-eligibility) rejection was withdrawn.
• 35 U.S.C. 102 (novelty) rejection was withdrawn.
• A new rejection under 35 U.S.C. 103 (obviousness) was entered.
• The Applicants eventually overcame the 35 U.S.C. 103 rejection, and a US Patent was issued in August 2017.
Discussion of Claim Amendments
The response states: “Applicants… amend claim 1… to more specifically recite that the data processing system executes in accordance with an answer ranking multi-instance learned model to implement the question answering system and to recite a hypothesis generation phase, a hypothesis and evidence scoring phase, and an answer ranking phase of a question answering pipeline of the question answering system… Therefore, independent claim 1… recites a specific configuration of, and improvement to, a data processing system.”
Discussion of Claim Amendments
The amended claim specifically recites the training mechanism used, and the phases for generating the answer. (Similar to Version B of the cat-recognition claim).
Note that the amended claim does not preempt all “question answering” solutions because other solutions could use a different training mechanism and, possibly, different phases for generating the answer.
Take Away Points
Claiming features related to the machine training mechanism, feature extraction, or output formats may be sufficient to overcome a 35 U.S.C. 101 rejection for a machine learning-related patent application.
However, nothing is guaranteed, and it is necessary to consider the specific features of any given invention before deciding which claim amendments or arguments would best advance prosecution in view of a 35 U.S.C. 101 rejection.
101 Rejections by Art Unit
101/Alice rejections are more commonly issued in some technology centers/art units than in others. One strategy may be to draft a patent application to encourage or avoid its entry into one of the technology centers/art units.
Source: http://www.ipwatchdog.com/2015/12/14/the-most-likely-art-units-for-alice-rejections/
Conclusion
Hopefully, in the near future, the USPTO will issue clear guidelines regarding which software (and machine learning) inventions are patentable under 35 U.S.C. 101 in view of Alice and the related cases.
Protecting Software Inventions
Including AI and ML Algorithms
Michael D. Stein, BakerHostetler LLP
Presented for Strafford: Patent Drafting for Machine Learning: Structural Claim Limitations, Avoiding 101 or 112 Rejections, Feb. 13, 2018
Outline
1. Patenting AI/ML Inventions
a. Software patents and functional claim limitations
b. Structural claim limitations for software
c. Inventorship questions
2. Trade Secret Protection
3. Strategic Use of Patent and Trade Secret Protection
I. Problem Overview
Problem: Courts lack guidance/procedure to effectively apply the law, and lack context for evaluating claims from the perspective of a POSITA. Too many patents are being invalidated on incorrect grounds!

Legal solutions (litigation): Use § 101 to invalidate overly broad (abstract) claims (Alice/Mayo); use § 112(b) to invalidate unclear claims (Aristocrat).

Alternative solution (applicants and USPTO): Include structural limitations in claims for software-related inventions; ensure the patent file history reflects the examiner’s consideration of § 101 and § 112 issues; provide context for courts.
Background
Critics argue:
• Patents do less good and cause more harm in the software industry than in other industries.
• The aggregation of overbroad and unclear claims can result in innovation-stifling patent thickets.
• Software patents are not necessary to spur innovation because software innovation is less costly than innovation in the life sciences.
USPTO-led Executive Action for Clarity in Patent Claims
Why functional claim limitations are used in software patents
Why functional claiming is so common in software patents:
1. Software structure and function can be separated. A software developer can write new software without knowing details of the hardware.
2. The software industry lacks a commonly accepted “vocabulary” for defining software elements. In software, a broad claim requires defining the invention at a higher level of abstraction. Software developers coin new terms to define the functional elements of the software, and the meaning and scope of such new terms are often not explicitly defined.
Functional claim limitations and 35 USC §112(f)
35 U.S.C. 112 Specification.
. . .
(f) ELEMENT IN CLAIM FOR A COMBINATION.— An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
Aristocrat Techs. Australia Pty Ltd. v. Int’l Game Tech., 521 F.3d 1328 (Fed. Cir. 2008)
Claim Disclosed Structure
Structural claim limitations for software: data structures and program structures.
Describing computer software in terms of “structure”
Essential feature (not shown): the relationship between the executable software and the hardware processor(s).
• The executable code is not an unconstrained, abstract set of commands written down by the programmer.
• The executable code must be committed to memory using “machine codes” selected from the specific machine language instruction set, or “native instructions,” designed into the hardware processor.
• The native instruction set is known to, and essentially built into, the hardware processor(s).
• Each native instruction is a discrete code recognized by the processing architecture and can specify particular registers for arithmetic, addressing, or control functions.
• Complex operations are built up by combining the simple native instructions.
Describing computer software in terms of “structure”
Key ideas:
• The relationship between executable software instructions and the hardware processor constrains the claimed system.
• Problems related to abstractness, indefiniteness, and overbreadth can be addressed by describing inventive computer software elements in the specification at this level of detail, and including appropriate limitations in the claims.
• I do not advocate inclusion of specific machine code or even source code listings in patent applications.
Example: Artificial Neural Network
Example claim to a computer-implemented system comprising an ANN
1. A classification system, comprising:
a processor configured to execute instructions programmed using a predefined set of machine codes; and
an artificial neural network (ANN) comprising:
first, second, and third input nodes, wherein each input node includes a memory location for storing an input value;
first, second, third, and fourth hidden nodes, wherein each hidden node is connected to each input node and includes computational instructions, implemented in machine codes of the processor, for computing first, second, third, and fourth output numbers, respectively; and
first and second output nodes, wherein the first output node includes a memory location for storing a first output signal indicative of a first classification, and the second output node includes a memory location for storing a second output signal indicative of a second classification.
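To see the structural character of this claim, the claimed elements can be mapped onto concrete program structure. The sketch below is mine, not from the slides, and the weight values are invented: three input-node memory locations, four fully connected hidden nodes, and two output-node memory locations holding the two classification signals.

```python
import math

# Memory locations for the three input nodes' stored values.
input_nodes = [0.0, 0.0, 0.0]

# One weight vector and bias per hidden node (values invented); each hidden
# node is connected to each of the three input nodes.
hidden_weights = [[0.2, -0.1, 0.4], [0.5, 0.3, -0.2],
                  [-0.3, 0.8, 0.1], [0.1, 0.1, 0.1]]
hidden_biases = [0.0, 0.1, -0.1, 0.2]

# Two output nodes: first and second classification signals.
output_weights = [[0.6, -0.4, 0.2, 0.1],
                  [-0.5, 0.7, -0.1, 0.3]]
output_biases = [0.0, 0.0]

def forward(x1, x2, x3):
    """Store the inputs, compute the four hidden outputs, then the two
    output signals."""
    input_nodes[:] = [x1, x2, x3]
    hidden = [math.tanh(sum(w * x for w, x in zip(ws, input_nodes)) + b)
              for ws, b in zip(hidden_weights, hidden_biases)]
    return [sum(w * h for w, h in zip(ws, hidden)) + b
            for ws, b in zip(output_weights, output_biases)]

out1, out2 = forward(1.0, 0.5, -0.5)  # two stored output signals
```

Each claim element (input node, hidden node, output node, memory location, connection) corresponds to an identifiable data or program structure here, which is the drafting point the claim illustrates.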
Example: Linked List Data Structure and Linear Search Algorithm
Boolean logic expressions and operators: logical operands and logical operators are combined. Operands are statements (that can be proven True or False), and the operators are logical AND, OR, and NOT. Boolean expressions involve comparison operators that can be evaluated to determine whether they are True or False. Comparison operators include: = < > ≥ ≤ ≠.

Expressions combining Boolean operators and comparison operators are written and evaluated in computer programming languages using control statements, which control which sections of code in a program are executed. Decision or conditional statements are a type of control statement and are often referred to as IF..THEN..ELSE statements.

Iterative constructs, or loops, allow a section of code to be repeated. There are several variations of iterative or looping constructs, but for the most part they fall into two categories: FOR loops and WHILE/DO loops.
Pseudocode for a linear search algorithm implemented with Boolean logic, a FOR loop, and an IF..THEN statement:
procedure linear_search (list, value)
   for each item in the list
      if item == value
         return the item's location
      end if
   end for
end procedure
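The pseudocode above translates directly into runnable Python (my rendering, not from the slides): the FOR loop iterates over the list, and the IF statement's Boolean comparison tests each item.

```python
def linear_search(items, value):
    """Return the location of the first item equal to value, else None."""
    for location, item in enumerate(items):
        if item == value:        # Boolean comparison operator (==)
            return location      # the item's location
    return None                  # value not present in the list

print(linear_search([7, 3, 9, 3], 9))  # -> 2
```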
Example claim to a computer-implemented search system comprising a linked list data structure
1. A search system, comprising:
a processor configured to execute instructions programmed using a predefined set of machine codes;
memory coupled to the processor;
a linked-list data structure implemented in said memory, comprising a first node, a plurality of intermediate nodes, and a terminator;
wherein the first node is connected to a first intermediate node;
wherein each of the intermediate nodes except for a last intermediate node is connected to a subsequent one of the intermediate nodes; and
wherein the last intermediate node is connected to the terminator; and
a linear search algorithm implemented in machine codes for the processor.
2. The system of claim 1, wherein the linear search algorithm comprises an iterative construct and a Boolean logic expression.
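The claimed structure can likewise be mapped onto program structure. This sketch is mine, not from the slides: nodes connected each to the subsequent one, a terminator (represented here as `None`), and a linear search iterating over the linked list with a Boolean comparison, as in claim 2.

```python
class Node:
    """One node of the linked list."""
    def __init__(self, value, next_node=None):
        self.value = value      # memory for the stored value
        self.next = next_node   # connection to the subsequent node

def build_list(values):
    """Build a linked list whose last node connects to the terminator."""
    head = None
    for v in reversed(values):
        head = Node(v, head)
    return head

def linked_search(head, value):
    """Linear search: an iterative construct plus a Boolean expression."""
    node, location = head, 0
    while node is not None:     # the terminator ends the iteration
        if node.value == value:
            return location
        node, location = node.next, location + 1
    return None

lst = build_list(["a", "b", "c", "d"])
print(linked_search(lst, "c"))  # -> 2
```

The first node, intermediate nodes, and terminator of the claim each correspond to a concrete object or sentinel in memory, illustrating how "structure" can be recited for a software invention.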
PTAB Example: Ex parte Boucher (Dec. 8, 2017)
1. A method for a computer for assisting the maintenance of a system of an aircraft comprising … a diagnostic device that performs computer-readable instructions using a processor for executing the steps of the method comprising:
providing an onboard aircraft database in the aircraft, …
detecting, using the detector, a fault of the system of the aircraft;
identifying… said each item of the list of said at least one item being liable to be the cause of said detected fault…
determining … said probability of the corresponding cause of the identified fault …
identifying … said at least one item of said equipment of the aircraft actually being the cause of said detected fault;
updating… an onboard temporary database on a basis of said fault identifier …
sending … a content of said onboard temporary database to said remote ground database …
erasing … said onboard temporary database upon completion of the update …
determining … said probability of the corresponding cause of the identified fault …; and
storing both of the onboard aircraft and temporary databases in the system of said aircraft.
PTAB Example: Ex parte Blackwell (Nov. 2, 2017)
1. A method comprising:
providing search results in a first dedicated screen space of a user interface and associated with a first collection based on at least one search term, and which excludes search results associated with any of one or more second collections;
providing a separate customizable preview of search results … in a separate pane located in a second dedicated screen space of the user interface …; and
providing options to a user to customize the second dedicated space and the separate pane … in the user interface that is configured to provide customization controls which:
allow the user to remove the separate pane in the second dedicated space;
order the separate pane with at least one other customizable preview pane in the second dedicated space; and
allow the user to add another pane in the second dedicated space …,
wherein the first collection is a first category of information for the at least one search term and the at least one second collection includes a second category of information for the at least one search term which is different from the first category of information.
Inventorship Questions
Who is the inventor?
• 35 U.S.C. 100 (definitions)
– (f) The term "inventor" means the individual or, if a joint invention, the individuals collectively who invented or discovered the subject matter of the invention.
– (g) The terms "joint inventor" and "coinventor" mean any 1 of the individuals who invented or discovered the subject matter of a joint invention.
• 35 U.S.C. 115 (Inventor’s oath or declaration)
– (a) NAMING THE INVENTOR; … each individual who is the inventor or a joint inventor of a claimed invention in an application for patent shall execute an oath or declaration …
Inventorship Questions
Improper Naming of Inventors
• MPEP 2157 Improper Naming of Inventors
– Although the AIA eliminated pre-AIA 35 U.S.C. 102(f), the patent laws still require the naming of the actual inventor … The Office presumes that the named inventor or joint inventors in the application are the actual inventor or joint inventors …
• 37 CFR 1.56 Duty to disclose information material to patentability
– … the Office is aware of and evaluates the teachings of all information material to patentability. Each individual associated with the filing and prosecution of a patent application has a duty of candor and good faith in dealing with the Office, which includes a duty to disclose to the Office all information known to that individual to be material to patentability as defined in this section.
Duty to Disclose “Information” on Inventorship
• MPEP 2001.04 Duty of Disclosure, Candor, and Good Faith In addition to prior art such as patents and publications, 37 CFR 1.56 includes, for example, information on enablement, possible prior public uses, sales, offers to sell, derived knowledge, prior invention by another, inventorship conflicts, and the like.
What Is a Trade Secret?
A trade secret is information (formula, pattern, compilation, program, device, method, technique, or process) that:
(1) derives economic value from not being generally known or readily ascertainable by proper means; and
(2) is the subject of "reasonable efforts" under the circumstances to maintain its secrecy.
Trade Secrets Can Be Used As Sword and Shield
Defend Trade Secrets Act (2016)
DTSA created a private, federal, civil right of action for alleged victims of trade secrets theft, as well as a uniform federal definition of “trade secret” applicable in federal criminal and civil cases.
The DTSA’s definition of “trade secret” appears substantively identical to the UTSA definition.
35 U.S.C. §273 (prior commercial use defense)
In the presence of clear and convincing evidence that it had engaged in commercial use of the patented subject matter at least one year before the effective filing date, the “prior user” would be allowed free and clear historical and future use of that patented subject matter.
Trade Secret Protection
Strategic Use of Patent & Trade Secret Protection
Patent Protection vs. Trade Secret Protection
ML Joke
An ML algorithm walks into a bar…
• Bartender: What are you having?
• Algorithm: What is everyone else having?