Sunday, October 2, 2011
Quicksort algorithm
The quicksort algorithm has a worst-case time complexity of O(n^2) and a best-case (and average-case) time complexity of O(n log n).
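As a minimal sketch (not from the original post), a list-based quicksort in Python makes the two cases concrete: a pivot that splits the list evenly gives the O(n log n) behavior, while consistently unbalanced splits degrade toward O(n^2).

```python
def quicksort(arr):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    mid = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    # Balanced left/right splits give O(n log n); a pivot that always
    # lands at an extreme gives the O(n^2) worst case.
    return quicksort(left) + mid + quicksort(right)
```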
Friday, September 2, 2011
Problem solving as search
Uninformed search
I/ Breadth-first search
II/ Depth-first search
Depth-first search is neither optimal nor complete.
"Incomplete" means it can start down an infinite path and never return to try other possibilities.
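As an illustrative sketch (the graph and node names are made up), breadth-first search explores with a FIFO queue; swapping the queue for a LIFO stack turns it into depth-first search, which is exactly why DFS can follow one branch forever.

```python
from collections import deque

def bfs(graph, start, goal):
    # Queue of paths, FIFO: the shallowest path is always expanded first.
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()   # DFS would use pop() (LIFO) instead
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None

# A small made-up graph for demonstration.
graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
```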
Informed search
I/ Best-first search approach
1. Greedy best-first search
"Greedy" means at each step it tries to get as close to the goal node as it can.
2. A* search
A* search tries to minimize the total estimated solution cost.
f(n) = g(n) + h(n)
g(n) is the actual cost of the path from the start node to node n.
h(n) is the estimated (heuristic) cost from node n to the goal.
A* search is complete. It is also optimal if h(n) is an admissible heuristic, i.e., if h(n) never overestimates the cost to reach the goal.
A* tends to run out of space before it runs out of time, since it keeps all previously generated nodes in memory.
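A minimal A* sketch in Python, assuming a small hypothetical graph with edge costs and an admissible heuristic h (all names and numbers below are illustrative):

```python
import heapq

def a_star(graph, h, start, goal):
    # Frontier is a priority queue ordered by f(n) = g(n) + h(n).
    frontier = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for nxt, cost in graph.get(node, []):
            new_g = g + cost
            # Only re-expand a node if we found a cheaper path to it.
            if new_g < best_g.get(nxt, float('inf')):
                best_g[nxt] = new_g
                heapq.heappush(frontier,
                               (new_g + h(nxt), new_g, nxt, path + [nxt]))
    return None, float('inf')

# Made-up graph: node -> list of (neighbor, edge_cost).
graph = {'S': [('A', 1), ('B', 4)],
         'A': [('B', 2), ('G', 5)],
         'B': [('G', 1)]}
# Admissible heuristic: never overestimates the true cost to G.
h = lambda n: {'S': 3, 'A': 2, 'B': 1, 'G': 0}[n]
```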
Reference
Stuart Russell and Peter Norvig, "Artificial Intelligence: A Modern Approach", 2nd ed., 2003.
Friday, October 2, 2009
Video compression
Parameters to affect digital video quality
1. Number of scan lines in each frame
2. Number of pixels in each scan line
3. Number of frames per second
4. Scanning techniques
Video compression makes use of two different compression techniques
1. Spatial redundancy (intra-frame compression)
2. Temporal redundancy (inter-frame compression)
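A toy illustration of temporal redundancy (all pixel values are made up): because consecutive frames are nearly identical, the encoder can store a mostly-zero residual instead of the full second frame, and the residual entropy-codes far more compactly.

```python
# Two consecutive "frames" of made-up pixel values.
frame1 = [10, 10, 10, 200, 200, 10]
frame2 = [10, 10, 10, 201, 200, 10]   # nearly identical to frame1

# Residual: mostly zeros, which compresses well.
residual = [b - a for a, b in zip(frame1, frame2)]

# The decoder reconstructs frame 2 from frame 1 plus the residual.
reconstructed = [a + r for a, r in zip(frame1, residual)]
```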
Tuesday, September 15, 2009
What is quadrature mirror filter (QMF) and conjugate quadrature filter (CQF)?
1. How does it work?
2. How is it designed?
3. Where is it applied?
1. The quadrature mirror filter (QMF) does not work in general: it admits only two filter coefficients (unless IIR filters are used).
2. The conjugate quadrature filter (CQF) works with longer filters (usually an even number of filter coefficients).
For example:
        +--- h_1 ---> x_h_1(n) ---> g_1 ---+
x(n) ---|                                  |---> x^(n)
        +--- h_0 ---> x_h_0(n) ---> g_0 ---+
Fig. Two-channel filter bank.
1. For QMF, if h_0 = {a, b}:
h_1 = {a, -b}: the same as h_0 with every other coefficient negated.
g_0 = {a, b}: the same as h_0.
g_1 = {-a, b}: equal to -h_1.
2. For CQF, if h_0 = {a, b}:
h_1 = {b, -a}: h_0 reversed, with every other coefficient negated.
g_0 = {b, a}: h_0 reversed.
g_1 = {-a, b}: h_1 reversed.
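The filter relations above can be written out directly in Python (the helper names are mine; the length-2 case mirrors the {a, b} example, and the CQF construction extends to longer even-length filters):

```python
def qmf_filters(h0):
    # QMF: h1 negates every other coefficient of h0;
    # g0 equals h0, and g1 equals -h1.
    h1 = [(-1) ** k * c for k, c in enumerate(h0)]
    g0 = list(h0)
    g1 = [-c for c in h1]
    return h1, g0, g1

def cqf_filters(h0):
    # CQF: h1 reverses h0 and negates every other coefficient;
    # g0 reverses h0, and g1 reverses h1.
    h1 = [(-1) ** k * c for k, c in enumerate(reversed(h0))]
    g0 = list(reversed(h0))
    g1 = list(reversed(h1))
    return h1, g0, g1
```

With a = 1 and b = 2, these reproduce the sets listed above.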
Friday, September 11, 2009
Simulated Annealing (SA) algorithms
1. Always accept the better state.
2. Accept a worse state with probability e^(-Delta/T).
Delta = h(s_current) - h(s_new), and T is the temperature parameter, whose value is decreased at each iteration.
*Note: The above algorithm tries to find the global maximum.
Properties
1. At high temperatures, there is a higher chance of accepting a worse state.
2. At low temperatures, the SA algorithm behaves like greedy search.
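A sketch of the loop above in Python (the parameter values and the geometric cooling schedule are illustrative assumptions; tracking the best state seen is a common practical addition, not part of the minimal algorithm):

```python
import math
import random

def simulated_annealing(h, neighbor, s0, T0=1.0, alpha=0.95, steps=200):
    # Maximizes h using the acceptance rule above.
    s, best, T = s0, s0, T0
    for _ in range(steps):
        s_new = neighbor(s)
        delta = h(s) - h(s_new)          # Delta = h(s_current) - h(s_new)
        # 1. Always accept a better state (delta <= 0).
        # 2. Accept a worse state with probability e^(-Delta/T).
        if delta <= 0 or random.random() < math.exp(-delta / T):
            s = s_new
            if h(s) > h(best):
                best = s
        T *= alpha                       # cool the temperature each iteration
    return best
```

At large T, e^(-Delta/T) is close to 1 and worse states are accepted often; as T shrinks, the rule degenerates to greedy hill-climbing, matching the two properties listed above.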
Thursday, September 10, 2009
Markov models and Hidden Markov Models (HMMs)
1. What is Markov model?
It consists of a set of states and a set of transition probabilities from one state to another.
2. How does it work? We represent the state transition probabilities in matrix form and compute the probability of a sequence as follows.
P(s1s2s3)=P(s3|s2)P(s2|s1)P(s1|start_state)
* Note: the conditional probabilities are the transition probabilities.
3. Where is it applied?
Given a Markov model, we can compute the probability that a sequence occurs, for example P(s1s2s3). If we are given two models, one per class, we can compute the probability of the sequence under each model and then decide which class the sequence s1s2s3 belongs to by comparing the two probabilities. Markov models are applied in speech recognition and in the classification of genes (DNA).
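A small sketch of this computation (all probabilities below are made-up numbers):

```python
# Toy Markov model: transition probabilities, including transitions
# out of a designated start state.
P = {
    'start': {'s1': 0.6, 's2': 0.4},
    's1':    {'s1': 0.3, 's2': 0.7},
    's2':    {'s1': 0.5, 's2': 0.5},
}

def sequence_prob(seq, P, start='start'):
    # P(s1 s2 s3) = P(s1|start) * P(s2|s1) * P(s3|s2)
    prob, prev = 1.0, start
    for s in seq:
        prob *= P[prev][s]
        prev = s
    return prob

p = sequence_prob(['s1', 's2', 's2'], P)   # 0.6 * 0.7 * 0.5
```

For classification, one would evaluate `sequence_prob` under each class's model and pick the class with the higher value.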
1. What is HMM?
It consists of a set of states, the transition probabilities between them, and, for each state, a probability distribution over the values (observations) that the state can emit.
2. Where is it applied? It can be applied in speech recognition.
3. How does it work?
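The post leaves question 3 unanswered; as a sketch, the standard way an HMM computes the probability of an observation sequence is the forward algorithm (the states, observations, and probabilities below are all illustrative):

```python
def forward(obs, states, start_p, trans_p, emit_p):
    # Forward algorithm: P(observation sequence | HMM).
    # alpha[s] = probability of the observations so far, ending in state s.
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[p] * trans_p[p][s] for p in states)
                    * emit_p[s][o]
                 for s in states}
    return sum(alpha.values())

# Made-up two-state HMM.
states = ('Rainy', 'Sunny')
start_p = {'Rainy': 0.5, 'Sunny': 0.5}
trans_p = {'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
           'Sunny': {'Rainy': 0.4, 'Sunny': 0.6}}
emit_p = {'Rainy': {'walk': 0.1, 'shop': 0.9},
          'Sunny': {'walk': 0.8, 'shop': 0.2}}
p = forward(('walk', 'shop'), states, start_p, trans_p, emit_p)
```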
Wednesday, September 9, 2009
Evolutionary algorithms
1. Initialize a population P.
2. Evaluate P.
3. If P satisfies the termination condition, then stop;
otherwise select P' from P, choosing the individuals on which the fitness (objective) function scores best.
4. Apply crossover and/or mutation operators to P' to reproduce P.
5. Go back to step 2.
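The steps above can be sketched as follows (the bit-string representation, keeping the fitter half as P', one-point crossover, and bit-flip mutation are illustrative choices; the termination condition here is simply a fixed number of generations):

```python
import random

def evolve(fitness, init_pop, generations=50, mut_rate=0.1):
    pop = list(init_pop)
    for _ in range(generations):
        # Steps 2-3: evaluate P and select the fitter half P'.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]
        # Step 4: crossover and mutation reproduce the next population P.
        children = []
        while len(parents) + len(children) < len(pop):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]                 # one-point crossover
            child = [g if random.random() > mut_rate else 1 - g
                     for g in child]                  # bit-flip mutation
            children.append(child)
        pop = parents + children                      # step 5: repeat
    return max(pop, key=fitness)
```

Because the fitter half is carried over unchanged, the best fitness in the population never decreases between generations.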