-
This post talks about how to use a biLSTM neural network to learn word-level and character-level representations and predict PoS tags. The model achieves state-of-the-art performance on the CoNLL2000 dataset.
-
This post briefly talks about common activation functions and gradient-based optimizers used in deep learning.
-
This post briefly reviews concepts from linear algebra, calculus, probability, and optimization that form the mathematical foundation required for deep learning.
-
This post provides a brief guide to the design of MapReduce algorithms. In particular, it presents a number of "design patterns" that capture effective solutions to common problems.
-
This post briefly talks about the basics of MapReduce, including mappers, reducers, partitioners, and combiners.
-
This post briefly talks about what autoencoders are and how to build them in TensorFlow to compress and denoise images.
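
As a quick taste of the last item, below is a minimal denoising-autoencoder sketch in TensorFlow/Keras. The layer sizes, noise level, and use of MNIST are illustrative assumptions, not details taken from the post itself.

```python
# Minimal sketch of a convolutional denoising autoencoder (assumed setup:
# MNIST images, Gaussian noise, small encoder/decoder; not the post's exact model).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Load and normalize images to [0, 1], adding a channel axis.
(x_train, _), (x_test, _) = tf.keras.datasets.mnist.load_data()
x_train = x_train.astype("float32")[..., np.newaxis] / 255.0
x_test = x_test.astype("float32")[..., np.newaxis] / 255.0

# Corrupt the inputs with Gaussian noise; the reconstruction targets stay clean.
noise = 0.3
x_train_noisy = np.clip(x_train + noise * np.random.randn(*x_train.shape), 0.0, 1.0)
x_test_noisy = np.clip(x_test + noise * np.random.randn(*x_test.shape), 0.0, 1.0)

# Encoder: downsample 28x28 -> 7x7 to obtain a compressed representation.
inputs = layers.Input(shape=(28, 28, 1))
x = layers.Conv2D(16, 3, activation="relu", padding="same")(inputs)
x = layers.MaxPooling2D(2, padding="same")(x)
x = layers.Conv2D(8, 3, activation="relu", padding="same")(x)
encoded = layers.MaxPooling2D(2, padding="same")(x)

# Decoder: upsample back to the original image size.
x = layers.Conv2D(8, 3, activation="relu", padding="same")(encoded)
x = layers.UpSampling2D(2)(x)
x = layers.Conv2D(16, 3, activation="relu", padding="same")(x)
x = layers.UpSampling2D(2)(x)
decoded = layers.Conv2D(1, 3, activation="sigmoid", padding="same")(x)

# Train to reconstruct clean images from noisy inputs.
autoencoder = models.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
autoencoder.fit(x_train_noisy, x_train, epochs=5, batch_size=128,
                validation_data=(x_test_noisy, x_test))
```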