Python for Stock Data

Features:
1. Strong scientific libraries
2. Actively maintained
3. Fast

Install pandas on CentOS

$ sudo easy_install pandas

>>> import numpy
>>> numpy.version.version
'1.13.3'

Print the last 5 rows of the data frame

import pandas as pd


def test_run():
    df = pd.read_csv("data/AAPL.csv")
    print(df[-5:])  # slice of the last 5 rows (equivalent to df.tail())

if __name__ == "__main__":
    test_run()

Compute the max closing price of Apple and IBM

import pandas as pd


def get_max_close(symbol):
    """Return the maximum closing price for the given stock symbol."""
    df = pd.read_csv("data/{}.csv".format(symbol))
    return df['Close'].max()


def test_run():
    for symbol in ['AAPL', 'IBM']:
        print("Max close")
        print(symbol, get_max_close(symbol))


if __name__ == "__main__":  # if run standalone
    test_run()

Creating Paper Transformations

ViewAnimationUtils.createCircularReveal(
View view,
int centerX,
int centerY,
float startRadius,
float endRadius
)

@Override
public void onClick(View view) {
	// Treat the card as "veggie" when its current background is the green color
	ColorDrawable background = (ColorDrawable) view.getBackground();
	boolean isVeggie = background != null && background.getColor() == green;

	int finalRadius = (int) Math.hypot(view.getWidth() / 2, view.getHeight() / 2);

	if (isVeggie) {
		text1.setText(baconTitle);
		text2.setText(baconText);
		view.setBackgroundColor(white);
	} else {
		Animator anim = ViewAnimationUtils.createCircularReveal(
			view, view.getWidth() / 2, view.getHeight() / 2, 0, finalRadius);
		text1.setText(veggieTitle);
		text2.setText(veggieText);
		view.setBackgroundColor(green);
		anim.start();
	}
}

res/layout/activity.xml

<android.support.design.widget.CoordinatorLayout
	xmlns:android="http://schemas.android.com/apk/res/android"
	xmlns:app="http://schemas.android.com/apk/res-auto"
	...>

CoordinatorLayout
-> AppBarLayout
	android:layout_height="168dp"
	android:background="@color/indigo_500"
	-> CollapsingToolbarLayout
		app:layout_scrollFlags="scroll|exitUntilCollapsed">
		-> Toolbar
			android:layout_height="56dp"
			app:layout_collapseMode="pin" />
RecyclerView
	app:layout_behavior="@string/appbar_scrolling_view_behavior" />

Color palette

<?xml version="1.0" encoding="utf-8"?>
<resources>
	<color name="indigo_300">#7986CB</color>
	<color name="indigo_500">#3F51B5</color>
	<color name="indigo_700">#303F9F</color>
	<color name="pink_a200">#FF4081</color>
</resources>

https://developer.android.com/training/material/theme.html

Fonts within a font family vary by weight and style.

onClick

TransitionManager.go(
	Scene.getSceneForLayout(
		(ViewGroup) findViewById(R.id.root),
		R.layout.activity_main_scene_info,
		MainActivity.this));

res/transition/grid_exit.xml
<explode xmlns... />

res/values/styles.xml
<style name="AppTheme.Home">
	<item name="android:windowExitTransition">
		@transition/grid_exit</item>
</style>

Implementing Surfaces

<LinearLayout
	android:layout_width="match_parent"
	android:layout_height="wrap_content"
	android:orientation="vertical">

	<FrameLayout
		android:layout_width="match_parent"
		android:layout_height="200dp"
		android:layout_margin="16dp"
		android:background="#fff"
		android:elevation="4dp" />

	<FrameLayout
		android:layout_width="match_parent"
		android:layout_height="200dp"
		android:layout_margin="16dp"
		android:background="#fff"
		android:elevation="8dp" />

	<FrameLayout
		android:layout_width="match_parent"
		android:layout_height="200dp"
		android:layout_margin="16dp"
		android:background="#fff"
		android:elevation="16dp" />

</LinearLayout>

Floating Action Button => FAB

dependencies {
	compile fileTree(dir: 'libs', include: ['*.jar'])
	compile 'com.android.support:appcompat-v7:22.2.0'
	compile 'com.android.support:design:22.2.0'
}

activity_main.xml

<android.support.design.widget.FloatingActionButton
	app:fabSize="normal"
	app:elevation="6dp"
	android:layout_gravity="end"
	app:pressedTranslationZ="12dp"

	android:id="@+id/fab"
	android:src="@drawable/fab_plus"

	android:layout_height="wrap_content"
	android:layout_width="wrap_content"
	android:layout_margin="16dp"
/>

styles.xml

Framework Material theme (API 21+):

<?xml version="1.0" encoding="utf-8"?>
<resources>
	<style name="AppTheme" parent="android:Theme.Material.Light">
	</style>
</resources>

Or, with the AppCompat support library:

<?xml version="1.0" encoding="utf-8"?>
<resources>
	<style name="AppTheme" parent="Theme.AppCompat.Light">
	</style>
</resources>

Android Design

Working with density-independent pixels
px = dp × (dpi / 160)
1 px / 1 dp = 160 dpi / 160 dpi
2 px / 1 dp = 320 dpi / 160 dpi
※ dpi = dots per inch
7-inch Nexus 7, 1280x800 px => 960x600 dp
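A minimal Python sketch of the conversion above; the 213 dpi figure for the Nexus 7 is an assumption (its tvdpi density bucket), used only for illustration.

def px_to_dp(px, dpi):
    # px = dp * (dpi / 160)  =>  dp = px * (160 / dpi)
    return px * 160.0 / dpi

def dp_to_px(dp, dpi):
    return dp * dpi / 160.0

print(px_to_dp(1280, 213))  # ~960 dp (7-inch Nexus 7 width)
print(px_to_dp(800, 213))   # ~600 dp
print(dp_to_px(1, 320))     # 2 px on an xhdpi (320 dpi) screen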

Density buckets
LDPI, MDPI, HDPI, XHDPI, XXHDPI, XXXHDPI

res/drawable/checkbox.xml

<selector xmlns:android="http://schemas.android.com/apk/res/android">
	<item android:state_pressed="true"
		android:state_checked="true"
		android:drawable="@drawable/box_checked_pressed" />

	<item android:state_pressed="true"
		android:drawable="@drawable/box_pressed" />

	<item android:state_checked="true"
		android:drawable="@drawable/box_checked" />

	<item android:drawable="@drawable/box_default" />

</selector>

content, padding, margins
FrameLayout, LinearLayout, RelativeLayout, GridLayout, ScrollView, ListView, ViewPager

styles.xml

<resources>

	<!-- Base application theme. -->
	<style name="AppTheme" parent="Theme.AppCompat.Light.DarkActionBar">
		<!-- Customize theme here -->
	</style>

	<style name="MyStyle">
		<item name="android:textColor">#FF255F26</item>
	</style>

	<style name="AnotherStyle">
		<item name="android:textColor">#1D175F</item>
		<item name="android:textStyle">bold</item>
	</style>
</resources>

Time Series Forecasting

Time series forecasting can predict values for business situations such as:
– Monthly beach bike rentals
– A stock's daily closing value
– Annual sheep population

Average Method
The best predictor of what will happen tomorrow is the average of everything that has happened up until now.

Moving Average Method
Forecast with the average of the most recent k observations, so that older observations drop out of the window.

Naive Method
If there is not enough data to build a predictive model, the naive method (forecast the next value as the last observed value) can supply forecasts for the near future.

Seasonal Naive Method
Assumes that the magnitude of the seasonal pattern will remain constant.

Exponential Smoothing Model
A weighted average of past observations, with the weights decaying exponentially for older observations.
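A minimal Python sketch of these five forecasting rules, assuming the history is just a list of numbers (the function names and the toy data are mine, not from any forecasting library):

def average_forecast(y):
    # Average method: the mean of everything observed so far
    return sum(y) / len(y)

def moving_average_forecast(y, k=3):
    # Moving average method: the mean of the most recent k observations
    return sum(y[-k:]) / k

def naive_forecast(y):
    # Naive method: the last observed value
    return y[-1]

def seasonal_naive_forecast(y, season=12):
    # Seasonal naive method: the value from one full season ago
    return y[-season]

def exponential_smoothing_forecast(y, alpha=0.3):
    # Exponential smoothing: weighted average of past observations,
    # with weights decaying exponentially for older values
    level = y[0]
    for value in y[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

rentals = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118]  # toy monthly data
print(average_forecast(rentals), naive_forecast(rentals), exponential_smoothing_forecast(rentals))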

Bayesian Inference

Representing and reasoning with probabilities
Bayesian Networks

Joint Distribution

storm  lightning  P(s, l)  P(s, l, thunder=T)  P(s, l, thunder=F)
T      T          0.25     0.20                0.05
T      F          0.40     0.04                0.36
F      T          0.05     0.04                0.01
F      F          0.30     0.03                0.27

Looking outside at 2 pm on a random summer day:
Pr(¬storm) = 0.35
Pr(lightning|storm) = 0.25/0.65 ≈ 0.385
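A quick Python check of the numbers above, storing the joint distribution over (storm, lightning), already marginalized over thunder, as a dictionary (a sketch, not course code):

joint = {
    (True, True): 0.25,   # (storm, lightning)
    (True, False): 0.40,
    (False, True): 0.05,
    (False, False): 0.30,
}

# Marginal: P(not storm) = sum of the rows where storm is False
p_not_storm = sum(p for (storm, _), p in joint.items() if not storm)
print(p_not_storm)  # 0.35

# Conditional: P(lightning | storm) = P(lightning, storm) / P(storm)
p_storm = 1 - p_not_storm
print(joint[(True, True)] / p_storm)  # 0.25 / 0.65 ≈ 0.385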

X is conditionally independent of Y given Z if the probability distribution governing X is independent of the value of Y given the value of Z; that is, if
P(X=x|Y=y,Z=z) = P(X=x|Z=z)
or, written more compactly,
P(X|Y,Z) = P(X|Z)

Sampling
A distribution is used for two things: computing the probability of a value, and generating values.
Uses: simulation of a complex process; approximate inference.

P(x) = Σ_y P(x, y)            (marginalization)
P(x, y) = P(x) P(y|x)         (chain rule)
P(y|x) = P(x|y) P(y) / P(x)   (Bayes' rule)
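A sketch of approximate inference by sampling, reusing the storm/lightning joint distribution: draw (storm, lightning) pairs and estimate P(lightning | storm) from the samples; the estimate should land near the exact 0.25/0.65.

import random

joint = {
    (True, True): 0.25,
    (True, False): 0.40,
    (False, True): 0.05,
    (False, False): 0.30,
}
outcomes, weights = list(joint), list(joint.values())

# Generate values from the distribution
samples = random.choices(outcomes, weights=weights, k=100_000)

# Approximate inference: condition on storm, count lightning
storm_samples = [s for s in samples if s[0]]
print(sum(1 for s in storm_samples if s[1]) / len(storm_samples))  # ≈ 0.385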

Bayesian Learning

Learn the best hypothesis given data and some domain knowledge

Learn the most probable hypothesis given data and domain knowledge

Pr(h|D)
Pr(h|D) = Pr(D|h)*Pr(h) / Pr(D) … Bayes’ rule
Pr(a,b) = Pr(a|b)P(b)
Pr(b,a) = Pr(b|a)P(a)

Bayesian Learning
For each h ∈ H:
	calculate Pr(h|D) = Pr(D|h) Pr(h) / Pr(D)
Output:
	h_MAP = argmax_h Pr(h|D)
	h_ML = argmax_h Pr(D|h)   (maximum likelihood; same as MAP when the prior is uniform)
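A brute-force Python sketch of the loop above for a small finite H (integer-threshold hypotheses, a uniform prior, and a noise-free likelihood that is 1 for consistent hypotheses and 0 otherwise; all of these choices are assumptions for illustration):

data = [(2, False), (4, False), (6, True), (9, True)]  # (x, label) pairs
thresholds = range(1, 11)  # H: h_theta(x) = (x >= theta)

def likelihood(theta):
    # P(D|h): 1 if h classifies every training example correctly, else 0
    return 1.0 if all((x >= theta) == label for x, label in data) else 0.0

def prior(theta):
    return 1.0 / len(thresholds)  # uniform P(h)

# h_MAP = argmax_h P(D|h) P(h); P(D) is the same for every h, so it is dropped
h_map = max(thresholds, key=lambda t: likelihood(t) * prior(t))
print(h_map)  # 5 (thresholds 5 and 6 are both consistent; max returns the first)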

VC Dimensions

Infinite Hypothesis Spaces
m ≥ (1/ε)(ln|H| + ln(1/δ))   (sample complexity bound for a finite H; it breaks down when |H| = ∞)
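A worked example of the bound in Python, with assumed values ε = 0.1, δ = 0.05, |H| = 10; the point of this section is that the bound is only usable for finite H.

from math import ceil, log

def sample_bound(h_size, epsilon, delta):
    # m >= (1/epsilon) * (ln|H| + ln(1/delta))
    return ceil((1.0 / epsilon) * (log(h_size) + log(1.0 / delta)))

print(sample_bound(10, 0.1, 0.05))  # 53 examples suffice when |H| = 10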

spaces are infinite
– linear separators
– artificial neural networks
– decision trees (continuous input)

X = {1,2,3,4,5,6,7,8,9,10}
H: h(x) = (x ≥ θ), θ ∈ ℝ
|H| = ∞

Track all hypotheses? Only the non-negative integer thresholds produce distinct behaviors on X, so we can keep the version space.

X = ℝ
H = {h(x) = (x ∈ [a,b])}
parameterized by a, b ∈ ℝ
VC dimension = 2
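A brute-force Python check that intervals shatter 2 points but not 3 (the helper names are mine; candidate endpoints are restricted to the points themselves plus a sentinel, which is enough because any interval can be shrunk to its outermost included points):

from itertools import product

def can_realize(points, labels):
    # Is there an interval [a, b] that labels the points exactly this way?
    candidates = list(points) + [min(points) - 1]
    for a in candidates:
        for b in candidates:
            if all((a <= x <= b) == lab for x, lab in zip(points, labels)):
                return True
    return False

def shatters(points):
    # Shattered = every labeling of the points is realizable by some h in H
    return all(can_realize(points, labels)
               for labels in product([False, True], repeat=len(points)))

print(shatters([1, 2]))     # True  -> VC dimension >= 2
print(shatters([1, 2, 3]))  # False -> the labeling (+, -, +) is impossible, so VC < 3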

Computational Learning Theory

Mondrian Composition
https://www.khanacademy.org/humanities/ap-art-history/later-europe-and-americas/modernity-ap/a/mondrian-composition
Colored Voronoi Diagram

Support vector machines (SVMs), perceptron
Nearest neighbor (1-NN)
Decision trees

– defining learning problems
– showing that specific algorithms work
– showing that some problems are fundamentally hard

The theory of computing analyzes how algorithms use resources such as time and space: O(n log n), O(n^2).

Inductive learning is characterized by:
1. Probability of successful training
2. Number of examples to train on
3. Complexity of the hypothesis class
4. Accuracy to which the target concept is approximated
5. Manner in which training examples are presented
6. Manner in which training examples are selected

Computational complexity
– How much computational effort is needed for a learner to converge?
Sample complexity (batch)
– How many training examples are needed for a learner to create a successful hypothesis?
Mistake bounds (online)
– How many misclassifications can a learner make over an infinite run?

True hypothesis: c ∈ H
Training set: S ⊆ X (with labels c(x) for x ∈ S)
Candidate hypothesis: h ∈ H
Consistent learner: produces h such that c(x) = h(x) for all x ∈ S
Version space: VS(S) = {h ∈ H : h is consistent with S}, the hypotheses consistent with the examples
True error: error_D(h) = Pr_{x~D}[c(x) ≠ h(x)]
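A Python sketch of these definitions using the threshold hypothesis class from the VC-dimension notes; the concept c, the sample S, and the uniform distribution D are all made up for illustration:

X = range(1, 11)                       # instance space {1, ..., 10}
H = {t: (lambda x, t=t: x >= t) for t in range(1, 12)}  # threshold hypotheses
c = H[6]                               # true hypothesis c in H
S = [2, 7, 9]                          # training set S, labeled by c(x)

# Version space: hypotheses consistent with every training example
VS = [t for t, h in H.items() if all(h(x) == c(x) for x in S)]
print(VS)  # [3, 4, 5, 6, 7]

# True error under a uniform D over X: Pr_{x~D}[c(x) != h(x)]
def error(h):
    return sum(1 for x in X if h(x) != c(x)) / len(X)

print(error(H[6]))  # 0.0
print(error(H[3]))  # 0.3 (consistent with S, but wrong on x = 3, 4, 5)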