1. How does it work?
2. How is it designed?
3. Where is it applied?
1. The quadrature mirror filter (QMF) does not achieve perfect reconstruction in general: with FIR filters, only trivial two-coefficient solutions exist (unless IIR filters are used).
2. The conjugate quadrature filter (CQF) can work with longer filters (usually an even number of filter coefficients).
For example:
         +--- h_1 ---> (down 2) ---> (up 2) ---> g_1 ---+
x(n) ----+                                              +---> x^(n)
         +--- h_0 ---> (down 2) ---> (up 2) ---> g_0 ---+
Fig. Two-channel filter bank: analysis filters h_0 and h_1, downsampling and upsampling by 2, and synthesis filters g_0 and g_1.
1. For QMF, if h_0 = {a, b}:
h_1 = {a, -b}: the same as h_0, but with every other value negated.
g_0 = {a, b}: the same as h_0.
g_1 = {-a, b}: the negation of h_1.
2. For CQF, if h_0 = {a, b}:
h_1 = {b, -a}: the reversed version of h_0, with every other value negated.
g_0 = {b, a}: the reversed version of h_0.
g_1 = {-a, b}: the reversed version of h_1.
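The CQF relations above can be checked numerically. Below is a minimal sketch in Python using the Haar filters (the special case a = b = 1/sqrt(2)); the helper names are illustrative, not from any particular library. Running the two analysis channels through downsampling, upsampling, and the synthesis filters reconstructs x(n) up to a one-sample delay.

```python
from math import sqrt

def convolve(x, h):
    """Full linear convolution of signal x with filter h."""
    y = [0.0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(h)):
            if 0 <= n - k < len(x):
                y[n] += h[k] * x[n - k]
    return y

def analysis(x, h0, h1):
    """Filter with h0/h1, then keep odd-indexed samples (downsample by 2)."""
    return convolve(x, h0)[1::2], convolve(x, h1)[1::2]

def synthesis(v0, v1, g0, g1):
    """Upsample by 2 (samples at odd indices), filter with g0/g1, and sum."""
    def upsample(v):
        w = [0.0] * (2 * len(v) + 1)
        for k, s in enumerate(v):
            w[2 * k + 1] = s
        return w
    y0 = convolve(upsample(v0), g0)
    y1 = convolve(upsample(v1), g1)
    return [a + b for a, b in zip(y0, y1)]

c = 1 / sqrt(2)          # Haar case: a = b = 1/sqrt(2)
h0 = [c, c]              # {a, b}
h1 = [c, -c]             # {b, -a}: h0 reversed, every other value negated
g0 = [c, c]              # {b, a}: h0 reversed
g1 = [-c, c]             # {-a, b}: h1 reversed

x = [1.0, 3.0, -2.0, 4.0, 0.0, 5.0]
v0, v1 = analysis(x, h0, h1)
y = synthesis(v0, v1, g0, g1)
# y[1:1+len(x)] equals x: perfect reconstruction with a one-sample delay
```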
Tuesday, September 15, 2009
Friday, September 11, 2009
Simulated Annealing (SA) algorithm
1. Always accept a better state.
2. Accept a worse state with probability e^(-Delta/T).
Delta = h(s_current) - h(s_new). T is a temperature parameter whose value is decreased on each iteration.
*Note: The above algorithm tries to find the global maximum.
Properties
1. At high temperature, there is a higher chance of accepting a worse state.
2. At low temperature, the SA algorithm behaves like greedy (hill-climbing) search.
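The two rules above can be sketched in a few lines of Python. The function names, the geometric cooling schedule, and the toy objective are all made up for illustration:

```python
import math
import random

def simulated_annealing(h, neighbor, s0, t0=1.0, cooling=0.95, steps=500):
    """Maximize h starting from state s0 (toy sketch; names are illustrative)."""
    s, t = s0, t0
    best = s
    for _ in range(steps):
        s_new = neighbor(s)
        delta = h(s) - h(s_new)              # positive when s_new is worse
        if delta <= 0 or random.random() < math.exp(-delta / t):
            s = s_new                        # better: always; worse: w.p. e^(-Delta/T)
        if h(s) > h(best):
            best = s
        t *= cooling                         # decrease the temperature each iteration
    return best

# Example: maximize h(x) = -(x - 2)^2, whose global maximum is at x = 2.
random.seed(0)
best = simulated_annealing(lambda x: -(x - 2) ** 2,
                           lambda x: x + random.uniform(-0.5, 0.5),
                           s0=-5.0)
```

Note how the single acceptance test covers both rules: when Delta <= 0 the new state is at least as good and is always accepted, and as T shrinks the probability e^(-Delta/T) of accepting a worse state goes to zero, so the loop degenerates into greedy search.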
Thursday, September 10, 2009
Markov models and Hidden Markov Models (HMMs)
1. What is a Markov model?
It consists of a set of states and a set of transition probabilities from one state to another.
2. How does it work? We represent the state transition probabilities in matrix form and compute the probability of the sequence as follows.
P(s1s2s3)=P(s3|s2)P(s2|s1)P(s1|start_state)
* Note: the conditional probabilities are the transition probabilities.
3. Where is it applied?
When we are given a Markov model, we can compute the probability that a sequence occurred, for example, P(s1s2s3). If we are given two models, each representing one class, we can compute the probability of the sequence under each model and then decide which class the sequence s1s2s3 belongs to by comparing the two probabilities. Markov models are applied in speech recognition and in gene (DNA) sequence classification.
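The chain-rule computation P(s1s2s3) = P(s3|s2)P(s2|s1)P(s1|start_state) can be written as a short function. The toy weather model and all of its probabilities below are invented for illustration:

```python
def sequence_probability(seq, start_p, trans_p):
    """P(s1 s2 ... sn) = P(s1|start) * product of P(s_k | s_{k-1})."""
    p = start_p[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        p *= trans_p[prev][cur]
    return p

# Toy two-state weather model (all numbers are made up)
start = {"sun": 0.6, "rain": 0.4}
trans = {"sun": {"sun": 0.8, "rain": 0.2},
         "rain": {"sun": 0.5, "rain": 0.5}}

p = sequence_probability(["sun", "sun", "rain"], start, trans)
# p = 0.6 * 0.8 * 0.2 = 0.096
```

Classification then amounts to evaluating `sequence_probability` under one model per class and picking the class with the larger value.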
1. What is an HMM?
It consists of a set of hidden states with transition probabilities between them and, for each state, a probability distribution over the observable values (emission probabilities).
2. Where is it applied? It can be applied in speech recognition.
3. How does it work?
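As a concrete illustration of the HMM definition above, the sketch below scores one hidden-state path together with its observations: each step multiplies a transition probability by an emission probability. The weather/activity model and all numbers are invented:

```python
def hmm_joint_probability(states, obs, start_p, trans_p, emit_p):
    """P(states, obs) = P(s1|start) e(o1|s1) * prod P(s_k|s_{k-1}) e(o_k|s_k)."""
    p = start_p[states[0]] * emit_p[states[0]][obs[0]]
    for k in range(1, len(states)):
        p *= trans_p[states[k - 1]][states[k]] * emit_p[states[k]][obs[k]]
    return p

# Toy model: hidden weather states, observed activities (numbers made up)
start = {"sun": 0.6, "rain": 0.4}
trans = {"sun": {"sun": 0.8, "rain": 0.2},
         "rain": {"sun": 0.5, "rain": 0.5}}
emit = {"sun": {"walk": 0.7, "read": 0.3},
        "rain": {"walk": 0.1, "read": 0.9}}

p = hmm_joint_probability(["sun", "rain"], ["walk", "read"],
                          start, trans, emit)
# p = 0.6 * 0.7 * 0.2 * 0.9
```

Unlike the plain Markov model, the states here are hidden: only the observations ("walk", "read") are seen, and in practice one sums this joint probability over all hidden paths (the forward algorithm) to score an observation sequence.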
Wednesday, September 9, 2009
Evolutionary algorithms
1. Initialize a population P.
2. Evaluate P.
3. If P satisfies the termination condition, then stop;
else select P' from P consisting of the individuals with the best fitness (objective function) values.
4. Apply crossover and/or mutation operators to P' to reproduce P.
5. Go back to step 2.
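The five steps above can be sketched as a short Python loop. This is a toy version using mutation only (no crossover) for brevity; the function names, population size, and objective are all made up for illustration:

```python
import random

def evolve(fitness, init, mutate, pop_size=20, generations=100):
    """Toy evolutionary loop following the steps above (names are illustrative)."""
    population = [init() for _ in range(pop_size)]            # step 1: initialize P
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)            # step 2: evaluate P
        if fitness(population[0]) > -1e-6:                    # step 3: terminate?
            break
        parents = population[: pop_size // 2]                 # step 3: select P'
        children = [mutate(random.choice(parents))            # step 4: reproduce P
                    for _ in range(pop_size - 1)]
        population = parents[:1] + children                   # keep the best (elitism)
    return max(population, key=fitness)                       # step 5 is the loop itself

# Example: maximize fitness(x) = -(x - 3)^2, best possible value 0 at x = 3.
random.seed(1)
best = evolve(lambda x: -(x - 3) ** 2,
              init=lambda: random.uniform(-10, 10),
              mutate=lambda x: x + random.gauss(0, 0.3))
```

Keeping the single best parent in each generation (elitism) is one common design choice: it guarantees the best fitness found never decreases across generations.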