JWT

JWT: https://jwt.io/

eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpXVCBtYWRlIGVhc3kiLCJhZG1pbiI6dHJ1ZX0.RhS5_R99IA0u_UffKr8xDh05Ob9Lb-kOBlmOWlspcc0

Header
{
  "alg": "HS256",
  "typ": "JWT"
}

Payload
{
  "sub": "1234567890",
  "name": "JWT made easy",
  "admin": true
}

Verify signature
HMACSHA256(
  base64UrlEncode(header) + "." +
  base64UrlEncode(payload),
  secret
)
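A minimal Python sketch of that signing step, using only the standard library; the secret and the claims here are placeholders for illustration:

import base64, hashlib, hmac, json

def b64url(data: bytes) -> bytes:
    "Base64url-encode without padding, as JWT requires."
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign_jwt(header: dict, payload: dict, secret: bytes) -> str:
    "Serialize header and payload, then append the HMAC-SHA256 signature."
    parts = [b64url(json.dumps(header, separators=(",", ":")).encode()),
             b64url(json.dumps(payload, separators=(",", ":")).encode())]
    signing_input = b".".join(parts)
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return (signing_input + b"." + b64url(sig)).decode()

# b"secret" is a placeholder key; to verify, recompute the signature and
# compare with hmac.compare_digest.
token = sign_jwt({"alg": "HS256", "typ": "JWT"},
                 {"sub": "1234567890", "name": "JWT made easy", "admin": True},
                 b"secret")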

How does JWT work
Client -> Server
$ curl -u user http://127.0.0.1/login
$ curl -H "Authorization: Bearer <token>" http://127.0.0.1/secure

Microservices

Configuration management tools: Puppet, Chef, Ansible

The term "Microservice Architecture" has sprung up over the last few years to describe a particular way of designing software applications as suites of independently deployable services. While there is no precise definition of this architectural style, there are certain common characteristics around organization around business capability, automated deployment, intelligence in the endpoints, and decentralized control of languages and data.

Modular, easy to deploy, scale independently
Microservices design pattern: applies to any application; rapid deployment, continuous delivery

gcloud compute zones list
gcloud config set compute/zone europe-west1-d

The 12-factor app
codebase
dependencies
config
backing services
build, release, run
processes: execute the app as one or more stateless processes
port binding
concurrency
disposability
dev/prod parity
logs
admin processes
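
Factor III (config) in practice means reading deploy-specific settings from the environment; a minimal sketch (the variable names are invented for the example):

import os

# Factor III (config): deploy-specific values come from the environment,
# so the same build runs unchanged in dev, staging, and production.
DATABASE_URL = os.environ["DATABASE_URL"]           # required; fail fast if missing
DEBUG = os.environ.get("DEBUG", "false") == "true"  # optional, with a default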

[vagrant@localhost app4]$ go build -o bin/hello ./hello/

Segment


from functools import lru_cache, reduce
import operator

def splits(characters, longest=12):
    "All ways to split characters into a first word and remainder."
    return [(characters[:i], characters[i:])
            for i in range(1, 1 + min(longest, len(characters)))]

def product(nums):
    "Multiply a sequence of numbers together."
    return reduce(operator.mul, nums, 1)

def Pwords(words):
    "Probability of a sequence of words (assumes a unigram model Pw is defined)."
    return product(Pw(w) for w in words)

@lru_cache(maxsize=None)  # standard-library replacement for the @memo decorator
def segment(text):
    "Best segmentation of text into words, by probability."
    if text == "": return []
    candidates = [[first] + segment(rest) for first, rest in splits(text)]
    return max(candidates, key=Pwords)
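
The code above assumes a unigram model Pw; a toy one (counts invented purely for the demo) is enough to exercise segment():

from collections import Counter

# toy corpus counts, invented for illustration only
counts = Counter({"jwt": 10, "made": 20, "easy": 15, "it": 30, "was": 25})
total = sum(counts.values())

def Pw(word):
    "Unigram probability, with a crude length penalty for unseen words."
    return counts[word] / total if word in counts else 1e-10 / 10**len(word)

print(segment("jwtmadeeasy"))  # -> ['jwt', 'made', 'easy']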

Spelling correction
c* = argmax_c P(c | w)
   = argmax_c P(w | c) P(c)    (Bayes' rule; P(w) is the same for every candidate c)
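
In code the argmax is a max over candidate corrections; candidates(), P_error() and P() are assumed helpers in the style of a Norvig spelling corrector, not anything defined in these notes:

def correct(w):
    "Most probable correction c for word w: argmax_c P(w|c) * P(c)."
    # candidates(w): plausible corrections; P_error(w, c): error model P(w|c);
    # P(c): language model. All three are assumed to exist.
    return max(candidates(w), key=lambda c: P_error(w, c) * P(c))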

Natural Language processing

Language model
- probabilistic
- word-based
- learned

P(word1, word2, ...)
L = {s1, s2, ...}   (a language as a set of sentences)

vs. hand-coded logical trees (the older, rule-based approach)

Chain rule:
P(w1, w2, ..., wn) = P(w1:n) = Πi P(wi | w1:i-1)

Markov assumption (condition on only the last k words):
P(wi | w1:i-1) ≈ P(wi | wi-k:i-1)

Stationarity assumption (probabilities do not depend on the position i):
P(wi | wi-1) = P(wj | wj-1)

Smoothing (reserve probability mass for unseen words)
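
A minimal bigram model with add-one (Laplace) smoothing makes the chain rule, Markov assumption and smoothing concrete; the corpus is a stand-in:

from collections import Counter

corpus = "the dog ran the dog sat the cat ran".split()  # stand-in corpus
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
V = len(unigrams)  # vocabulary size

def P_bigram(w, prev):
    "P(w | prev) with add-one smoothing, so unseen pairs still get mass."
    return (bigrams[(prev, w)] + 1) / (unigrams[prev] + V)

print(P_bigram("dog", "the"))  # a seen pair
print(P_bigram("sat", "cat"))  # an unseen pair, still > 0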

classification, clustering, input correction, sentiment analysis, information retrieval, question answering, machine translation, speech recognition, driving a car autonomously

P(the), P(der), P(rba)

Naive Bayes
k-Nearest Neighbor
Support vector machines
logistic regression
sort command
gzip command

(echo $(cat new EN | gzip | wc -c) EN; \
 echo $(cat new DE | gzip | wc -c) DE; \
 echo $(cat new AZ | gzip | wc -c) AZ) \
| sort -n | head -1
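
Roughly the same trick in Python via zlib; file names match the shell version, and this is a sketch rather than a robust classifier:

import zlib

def gzip_size(text: bytes) -> int:
    "Compressed size in bytes; a proxy for cross-entropy with the corpus."
    return len(zlib.compress(text))

new = open("new", "rb").read()
corpora = {lang: open(lang, "rb").read() for lang in ("EN", "DE", "AZ")}

# the corpus that compresses best together with the new text is the best model
best = min(corpora, key=lambda lang: gzip_size(corpora[lang] + new))
print(best)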

S* = argmax P(w1:n) = argmax Πi P(wi | w1:i-1)
Unigram approximation: S* = argmax Πi P(wi)

Robotics

IMU, 6 computers, GPS compass, GPS, E-stop, 5 lasers, camera, radar, control screen, steering monitor

probabilistic localization

Robotics
environment -> (sensor data) -> robot agent -> (actions) -> environment

Perception
filter(sensor data, internal state) -> updated internal state

x' = x + v Δt cos Θ
y' = y + v Δt sin Θ
Θ' = Θ + ω Δt
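
A sketch of that motion update; v is forward speed, omega the turn rate, dt the time step:

from math import cos, sin

def motion_update(x, y, theta, v, omega, dt):
    "Dead-reckoning pose update: move at speed v along heading theta, turning at omega."
    return (x + v * dt * cos(theta),
            y + v * dt * sin(theta),
            theta + omega * dt)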

3D vision

3D range, depth, distance

Two stereo images
(x2 - x1)/f = B/Z  ->  Z = f·B/(x2 - x1)
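
As a function (assuming f, B and the disparity are in consistent units):

def depth_from_disparity(f, B, x1, x2):
    "Depth Z from stereo disparity: Z = f*B / (x2 - x1); assumes x2 != x1."
    return f * B / (x2 - x1)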

SSD minimization
left patch -> normalize
right patch -> normalize
SSD = Σ over pixels of (left - right)²; the lowest SSD gives the best match
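
A numpy sketch of the patch comparison; extracting the patches is omitted:

import numpy as np

def ssd(left_patch, right_patch):
    "Sum of squared differences between brightness-normalized patches."
    l = (left_patch - left_patch.mean()) / (left_patch.std() + 1e-9)
    r = (right_patch - right_patch.mean()) / (right_patch.std() + 1e-9)
    return np.sum((l - r) ** 2)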

correspondence
cost of match, cost of occlusion

Dynamic programming, O(n²)
V(i, j) = best of { match(i, j) + V(i-1, j-1), occlusion cost + V(i-1, j), occlusion cost + V(i, j-1) }

structure from motion
3D world, location of camera

Image Formation

Similar triangles
x/f = X/Z

Vanishing points
x = X*f/Z, y = Y*f/Z

Thin lens
1/f = 1/Z + 1/z   (Z: object distance, z: image distance)

Computer vision
- classify objects
- 3D recognition

Invariance
pixel values 0…255
+/- gradient mask

Reasons to blur
- downsampling (anti-aliasing)
- noise reduction
Convolution is associative: (I * f) * g = I * (f * g), so kernels can be combined first
Gaussian kernel
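
A quick numpy check of that associativity on a 1-D signal; the kernels are small Gaussian-like stand-ins:

import numpy as np

I = np.array([0., 0., 1., 5., 3., 0., 0.])   # arbitrary 1-D "image"
f = np.array([1., 2., 1.]) / 4               # small Gaussian-like kernel
g = np.array([1., 4., 6., 4., 1.]) / 16      # another Gaussian-like kernel

# blurring twice equals blurring once with the combined kernel
lhs = np.convolve(np.convolve(I, f), g)
rhs = np.convolve(I, np.convolve(f, g))
print(np.allclose(lhs, rhs))  # True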

Harris corner detector
Σ(Ix)² large
Σ(Iy)² large
(both gradient-energy sums large in a window -> corner)

Modern feature detector
- localize
- unique signatures

Technology

MDPs
POMDPs, Belief Space
Reinforcement Learning
A*; h function; Monte Carlo

chess, go, robot soccer, poker, hide-and-go-seek, card solitaire, minesweeper

s (states), p (players), actions(s, p), result(s, a), terminal(s), u(s, p) (utility for player p)

deterministic, two-player, zero-sum

import math

def maxValue(s):
    "Minimax value of state s when it is the maximizing player's move."
    m = -math.inf
    for (a, s2) in successors(s):  # successors(s) yields (action, next-state) pairs
        m = max(m, value(s2))
    return m
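
The notes only write out the max side; a hedged sketch of the missing min and dispatch pieces, assuming terminal(s), utility(s), player(s) and successors(s) exist:

def minValue(s):
    "Minimax value of state s when it is the minimizing player's move."
    m = math.inf
    for (a, s2) in successors(s):
        m = min(m, value(s2))
    return m

def value(s):
    "Dispatch: terminal utility, else max or min depending on whose move it is."
    if terminal(s):
        return utility(s)
    return maxValue(s) if player(s) == "MAX" else minValue(s)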

Complexity: O(b^m)   (b = branching factor, m = search depth)

HMMs and Filters

Hidden Markov Models (HMMs)
- analyze time series
- predict time series

Applications
- robotics
- medical
- finance
- speech
- language technology

HMMs are a special case of Bayes networks:
s1 -> s2 -> s3 -> ... -> sn    (Markov chain of hidden states)
z1    z2    z3         zn      (each hidden state si emits an observation zi)

Kalman filter, particle filter

localization problem
laser range finder

speech recognition -> Markov model
transition from one sound to the next, e.g. "I" to "a"

Hidden Markov chain (rain example; the numbers imply P(R|R) = 0.6, P(R|S) = 0.2)
P(R0) = 1, P(S0) = 0
P(R1) = 0.6
P(R2) = 0.6·0.6 + 0.2·0.4 = 0.44
P(R3) = 0.6·0.44 + 0.2·0.56 = 0.376

P(A1000)?
P(A∞) = lim t→∞ P(At)

Stationary distribution
P(At) = P(At-1)
P(At) = P(At | At-1) P(At-1) + P(At | Bt-1) P(Bt-1)
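
Iterating the rain example until it converges recovers both the values above and the stationary distribution:

p_rain_given_rain, p_rain_given_sun = 0.6, 0.2

p = 1.0  # P(R0) = 1
for t in range(1, 101):
    p = p_rain_given_rain * p + p_rain_given_sun * (1 - p)
    if t <= 3:
        print(t, round(p, 3))  # 0.6, 0.44, 0.376 as in the notes

print("stationary:", p)  # converges to 0.2 / (1 - 0.6 + 0.2) = 1/3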