Lecture 3: Information Theory

Today’s class is about: hypothesis testing, collocations, and information theory.

Hypothesis testing: last lecture covered the methodology. Collocation example: “religion war”.

PMI and PPMI:
- PMI = pointwise mutual information: PMI(x, y) = log2( P(x, y) / (P(x) P(y)) ) = I(x, y)
- PPMI = positive PMI: PPMI(x, y) = max(0, PMI(x, y))
- Example: for “Hong Kong”, the individual frequencies of “Hong” and “Kong” are low, but the frequency of “Hong Kong” …
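The PMI/PPMI definitions above can be sketched directly in code. This is a minimal illustration with made-up probabilities (the numbers are not from real corpus counts):

```python
import math

def pmi(p_xy, p_x, p_y):
    """Pointwise mutual information: log2 of observed vs. expected co-occurrence."""
    return math.log2(p_xy / (p_x * p_y))

def ppmi(p_xy, p_x, p_y):
    """Positive PMI: negative associations are clipped to zero."""
    return max(0.0, pmi(p_xy, p_x, p_y))

# Toy probabilities (made up): "Hong" and "Kong" are each rare,
# but nearly always occur together, so PMI is large and positive.
p_hong, p_kong, p_hong_kong = 0.001, 0.001, 0.0009
print(round(pmi(p_hong_kong, p_hong, p_kong), 2))  # log2(900) ≈ 9.81

# A pair that co-occurs less than chance has negative PMI; PPMI clips it:
print(ppmi(0.00005, 0.01, 0.01))  # 0.0
```

This shows why “Hong Kong” scores highly: the joint probability is far above what independence would predict.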

The test speed of neural networks?

Basically, the time spent on testing (inference) depends on:
- The complexity of the neural network. For example, the fastest architecture should be a fully-connected network, and a CNN should be faster than an LSTM, because an LSTM is sequential (sequential = slow).
- Currently, there are many ways to compress a deep learning model (e.g. remove the nodes with smaller weights).
- The complexity of …
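The compression idea mentioned above (removing weights with small magnitude) can be sketched as magnitude pruning. This is a toy illustration, not any particular library’s API; the keep fraction is an arbitrary choice:

```python
import numpy as np

def magnitude_prune(weights, keep_fraction=0.5):
    """Zero out the smallest-magnitude weights, keeping only the
    largest `keep_fraction` of entries (a crude pruning sketch)."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * keep_fraction)
    # Threshold = k-th largest absolute value.
    threshold = np.sort(flat)[::-1][k - 1] if k > 0 else np.inf
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned, mask = magnitude_prune(w, keep_fraction=0.25)
print(int(mask.sum()))  # 4 of the 16 weights survive
```

In practice pruning is usually followed by fine-tuning to recover accuracy.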

Lecture 2: Lexical association measures and hypothesis testing

Pre-lecture readings.

Lexical association. Named entities: http://www.nltk.org/book/ch07.html

Information extraction architecture: raw text -> sentence segmentation -> tokenization -> part-of-speech tagging -> entity detection -> relation detection.

Chunking: segments and labels multi-token sequences, as illustrated in 2.1.
- Noun-phrase (NP) chunking
- Tag patterns: describe sequences of tagged words
- Chunking with regular expressions
- Exploring text corpora
- Chinking: a chink is defined as a sequence of tokens that is not included in …
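The tag-pattern idea can be illustrated without NLTK. Below is a simplified pure-Python stand-in for regular-expression chunking (the NLTK reading uses `RegexpParser` for this), hard-coding the classic NP tag pattern DT? JJ* NN+ over (word, POS) pairs:

```python
def np_chunk(tagged):
    """Greedy NP chunker: match the tag pattern DT? JJ* NN+ and
    group the corresponding words into noun-phrase chunks."""
    chunks, i, n = [], 0, len(tagged)
    while i < n:
        j = i
        if j < n and tagged[j][1] == "DT":       # optional determiner
            j += 1
        while j < n and tagged[j][1] == "JJ":    # any number of adjectives
            j += 1
        k = j
        while k < n and tagged[k][1] == "NN":    # one or more nouns
            k += 1
        if k > j:                                # at least one noun matched
            chunks.append([w for w, _ in tagged[i:k]])
            i = k
        else:
            i += 1
    return chunks

sent = [("the", "DT"), ("little", "JJ"), ("yellow", "JJ"), ("dog", "NN"),
        ("barked", "VBD"), ("at", "IN"), ("the", "DT"), ("cat", "NN")]
print(np_chunk(sent))  # [['the', 'little', 'yellow', 'dog'], ['the', 'cat']]
```

Chinking would work the other way around: exclude matching token sequences from the chunks instead of selecting them.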

CMSC773: HW1

Question 2:
- Word order. Explanation: word order refers to the structure of a sentence:
  “Alexa, when is your birthday?” (Alexa answers)
  “Alexa, when your birthday is?” (Alexa answers)
  This will test whether Alexa handles some incorrect word orders.
- Inflectional morphology. Explanation:

Question 3:
Question 4: Reference: http://statweb.stanford.edu/~serban/116/bayes.pdf
Question 5:
Question 6: New definitions: log-entropy weighting, cosine …
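The excerpt cuts off at “cosine”, presumably cosine similarity. As a minimal sketch of that definition (dot product over the product of norms):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: dot(u, v) / (|u| |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine_similarity([1, 0], [1, 0]))  # 1.0 (same direction)
print(cosine_similarity([1, 0], [0, 1]))  # 0.0 (orthogonal)
```

With log-entropy weighting, the vectors would hold weighted term counts rather than raw counts before this similarity is computed.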

CMSC773: pre-course knowledge

Dependency parsing.

Dependency: focuses on relations between words.
- Typed: labels indicate the relationship between words
- Untyped: only records which words depend on each other

Phrase structure: focuses on identifying phrases and their recursive structure.

Dependency parsing methods:
- Shift-reduce: predicts from left to right; fast, but slightly less accurate (e.g. MaltParser)
- Spanning tree: calculates the full tree at once; slightly more accurate, …
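The shift-reduce idea can be sketched with the arc-standard transition system. This toy version replays a hand-written oracle action sequence for “she ate fish”; a real parser such as MaltParser instead learns which action to take at each step:

```python
def parse(words, actions):
    """Apply SHIFT / LEFT-ARC / RIGHT-ARC transitions, left to right,
    returning untyped (head, dependent) arcs."""
    stack, buffer, arcs = [], list(words), []
    for action in actions:
        if action == "SHIFT":            # move next word onto the stack
            stack.append(buffer.pop(0))
        elif action == "LEFT-ARC":       # top of stack heads the word below it
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif action == "RIGHT-ARC":      # word below the top heads the top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

arcs = parse(["she", "ate", "fish"],
             ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC"])
print(arcs)  # [('ate', 'she'), ('ate', 'fish')]
```

Attaching a label (e.g. nsubj, obj) to each arc would make the parse typed rather than untyped.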

1128

Challenges:
- Mathematical representation of the model
- Learning model parameters from data

Overview: generative face model.

Space of faces: goals:
- each face is represented as a high-dimensional vector
- each vector in the high-dimensional space represents a face

Each face consists of a shape vector Si and a texture vector Ti.

Shape and texture vectors: Assumption: known point-to-point correspondence between …
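The “space of faces” idea above can be sketched numerically: with point-to-point correspondence, new faces are convex combinations of example shape vectors (texture vectors Ti work the same way). The tiny 6-D vectors below are made-up stand-ins for real scan data:

```python
import numpy as np

# Each row is one example face's shape vector S_i (toy 6-D data).
S = np.array([[0.0, 1.0, 2.0, 0.0, 1.0, 2.0],   # example face 1
              [1.0, 1.0, 1.0, 1.0, 1.0, 1.0],   # example face 2
              [2.0, 0.0, 0.0, 2.0, 0.0, 0.0]])  # example face 3

weights = np.array([0.5, 0.3, 0.2])  # sums to 1: a convex combination
new_face = weights @ S               # a new shape vector in the span of the examples

print(new_face.shape)  # (6,)
```

Because correspondence is known, each coordinate of the combined vector blends the *same* facial point across all example faces, which is what makes the result look like a face.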

Iterator of set

Deformation

References:
http://www.pmp-book.org/download/slides/Deformation.pdf
https://zhuanlan.zhihu.com/p/25804146

Deformation energy: a geometric energy to stretch and bend a thin shell from one shape to another, expressed as the difference between the first and second fundamental forms (first: stretching; second: bending).

Approach: given constraints (handle position / orientation), find the surface that minimizes the deformation energy.

Linear approximation: the energy based on fundamental forms is a non-linear function of the displacements …
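The energy described above (penalizing changes in the fundamental forms) is commonly written as follows; the stiffness weights k_s and k_b are an assumption here, not from the excerpt:

```latex
% Thin-shell deformation energy (sketch): stretching is measured by the
% change in the first fundamental form I, bending by the change in the
% second fundamental form II, for a deformed surface S'.
E(S') = \int_{S} k_s \,\bigl\| \mathrm{I}' - \mathrm{I} \bigr\|_F^2
      + k_b \,\bigl\| \mathrm{II}' - \mathrm{II} \bigr\|_F^2 \,\mathrm{d}A
```

Minimizing this directly is non-linear in the displacements, which is why the notes turn to a linear approximation next.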

Nov14. Mesh Smoothing

Mesh smoothing: local averaging.
- Minimize local gradient energy in 3 dimensions
- Fourier transform (low-pass filter): similar to the local averaging idea
- Image convolution: F(A ∗ B) = F(A) · F(B) (the convolution theorem)

Spectral analysis: in general, extending eigenvalues and eigenvectors to linear operators on (continuous) functions. Fourier transform: approximate a signal as a weighted sum (linear combination) of sines and cosines of …
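The local-averaging step can be sketched as one iteration of Laplacian smoothing: each vertex moves toward the average of its neighbors. The tiny tetrahedron-like graph and the step size `lam` below are made up for illustration:

```python
def laplacian_smooth(positions, neighbors, lam=0.5):
    """One smoothing step: p_i += lam * (mean of neighbor positions - p_i)."""
    new_positions = []
    for i, p in enumerate(positions):
        nbrs = neighbors[i]
        avg = [sum(positions[j][d] for j in nbrs) / len(nbrs) for d in range(3)]
        new_positions.append([p[d] + lam * (avg[d] - p[d]) for d in range(3)])
    return new_positions

# A "spike": vertex 0 sits 3 units above the plane of its three neighbors.
pts = [[0.0, 0.0, 3.0], [1.0, 0.0, 0.0], [-1.0, 1.0, 0.0], [-1.0, -1.0, 0.0]]
nbrs = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2]}
smoothed = laplacian_smooth(pts, nbrs)
print(smoothed[0])  # spike pulled halfway down: z goes 3.0 -> 1.5
```

Repeating the step attenuates high-frequency detail first, which is the sense in which local averaging acts as a low-pass filter.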