Friday, August 15, 2014

Four Laws of Thermodynamics and Thermodynamic Equilibrium

The four laws of thermodynamics are:
  • Zeroth law of thermodynamics: If two systems are each in thermal equilibrium with a third system, they are in thermal equilibrium with each other. This law helps define the notion of temperature.
  • First law of thermodynamics: Energy is conserved; the change in the internal energy of a system equals the heat added to it minus the work done by it.
  • Second law of thermodynamics: The total entropy of an isolated system never decreases over time.
  • Third law of thermodynamics: The entropy of a system approaches a constant value as the temperature approaches absolute zero. With the exception of glasses, the entropy of a system at absolute zero is typically close to zero, equal to Boltzmann's constant times the log of the multiplicity of the quantum ground state.

Thermodynamic Equilibrium :- A system is said to be in thermodynamic equilibrium if it satisfies all three of:
1. Thermal Equilibrium
2. Mechanical Equilibrium
3. Chemical Equilibrium
Every system has a tendency to move toward equilibrium. Say a constraint f(u,v) = c held at one combination of u and v; now if u changes, v will change so as to restore the equilibrium once again.
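This restoring tendency can be sketched numerically. As a hypothetical instance of f(u,v) = c, take the isothermal ideal gas, where P·V = nRT is constant: if the volume changes, the pressure must adjust to keep the product fixed. The numbers below are purely illustrative.

```python
# Sketch: restoring equilibrium under a constraint f(u, v) = c.
# Hypothetical instance: isothermal ideal gas, P * V = n * R * T = const.

R = 8.314        # gas constant, J/(mol*K)
n = 1.0          # moles (assumed)
T = 300.0        # kelvin (assumed)
c = n * R * T    # the conserved combination f(P, V) = P * V

V1 = 0.0248      # initial volume, m^3 (assumed)
P1 = c / V1      # pressure consistent with equilibrium

V2 = 2 * V1      # the volume is doubled...
P2 = c / V2      # ...so the pressure halves to restore f(P, V) = c

print(P1, P2)
```

The pressure is forced back onto the constraint curve just as v is forced to follow a change in u.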

Arrow Of Time

As described by Stephen Hawking, the three arrows of time are as follows :-

1. The total Entropy of the Universe is increasing.
2. The more complex structures are arising (Ex :- Biological Evolution).
3. The Universe is expanding.

As entropy increases, more complex structures emerge automatically over time. The Universe also expands because the outward forces dominate the attractive central force of gravitation.


Entropy

In thermodynamics, entropy is a measure of the disorder of a system. It is closely related to heat: for a reversible process, the change of entropy equals the heat exchanged in that process divided by the temperature, dS = δQ_rev / T.
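As a worked instance of ΔS = Q_rev / T, consider melting ice at its melting point, where heat is absorbed reversibly at constant temperature. The sketch below uses the standard latent heat of fusion of ice, about 334 J/g; the mass is an arbitrary choice.

```python
# Entropy change for a reversible isothermal process: dS = dQ_rev / T.
# Example: melting 10 g of ice at its melting point.

L_fusion = 334.0   # latent heat of fusion of ice, J/g (standard value)
mass = 10.0        # grams (assumed)
T = 273.15         # kelvin, melting point of ice

Q_rev = mass * L_fusion   # heat absorbed reversibly at constant T
delta_S = Q_rev / T       # entropy gained by the ice

print(round(delta_S, 2))  # about 12.23 J/K
```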

The idea behind entropy was first glimpsed by the scientist Lazare Carnot in his 1803 paper Fundamental Principles of Equilibrium and Movement, which suggested that in any natural process there exists an inherent tendency towards the dissipation of useful energy. The term "entropy" itself was coined later by Rudolf Clausius in 1865.

Entropy plays an enormous role in defining the arrow of time. As time passes, the total entropy of a system and its surroundings is ever increasing, and this one-way growth defines the thermodynamic arrow of time.


Thursday, August 14, 2014

Percolation Theory

Def :- It is the study of connected clusters in random graphs.

The main objective of the subject can be explained by the following example :- Suppose a liquid is poured on top of a porous material. Will the liquid be able to make its way from hole to hole and reach the bottom?

The problem is simulated as follows. Take an n×n (or higher-dimensional) grid in which each edge/bond is open with probability p and closed with probability 1 − p. We have to determine whether there is an open path from one side of the grid to the other.

[Figure: bond percolation graph on a 3D grid]
One interesting result is this: there is a critical probability p* such that for p above p* the grid almost surely has a spanning open path, and for p below p* it almost surely does not (in the limit of an infinite grid).
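The crossing question can be probed by direct simulation. Below is a minimal Monte Carlo sketch (the function name, grid size, and trial counts are my own choices) of bond percolation on an n×n grid, checking for an open path from the top row to the bottom row by breadth-first search:

```python
import random
from collections import deque

def percolates(n, p, rng):
    """Bond percolation on an n x n grid of sites: each edge between
    neighbouring sites is open with probability p. Returns True if an
    open path connects the top row to the bottom row."""
    # open_right[r][c]: edge between (r, c) and (r, c+1) is open
    # open_down[r][c]:  edge between (r, c) and (r+1, c) is open
    open_right = [[rng.random() < p for _ in range(n - 1)] for _ in range(n)]
    open_down = [[rng.random() < p for _ in range(n)] for _ in range(n - 1)]

    seen = [[False] * n for _ in range(n)]
    queue = deque((0, c) for c in range(n))
    for c in range(n):
        seen[0][c] = True
    while queue:
        r, c = queue.popleft()
        if r == n - 1:
            return True
        # explore the four neighbours reachable through open edges
        if c + 1 < n and open_right[r][c] and not seen[r][c + 1]:
            seen[r][c + 1] = True
            queue.append((r, c + 1))
        if c > 0 and open_right[r][c - 1] and not seen[r][c - 1]:
            seen[r][c - 1] = True
            queue.append((r, c - 1))
        if r + 1 < n and open_down[r][c] and not seen[r + 1][c]:
            seen[r + 1][c] = True
            queue.append((r + 1, c))
        if r > 0 and open_down[r - 1][c] and not seen[r - 1][c]:
            seen[r - 1][c] = True
            queue.append((r - 1, c))
    return False

rng = random.Random(0)
trials = 200
for p in (0.3, 0.5, 0.7):
    hits = sum(percolates(20, p, rng) for _ in range(trials))
    print(p, hits / trials)
```

For bond percolation on the infinite 2D square lattice the critical probability is known to be exactly 1/2, and as n grows the estimated crossing frequency jumps from near 0 to near 1 around p = 0.5.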

Tuesday, August 12, 2014

Graph Coloring Problem

Graph :- It is an ordered pair G(V,E) where V is the set of vertices and E is the set of edges joining some of those vertices. Thus a graph G(V,E) has V = {v | v is a vertex of G} and E ⊆ {(u,v) | u,v ∈ V}.

Graph Coloring :- 
           Vertex Coloring :- Find a vector F = (fv), where fv is called the color of vertex v, such that fv ≠ fu if (u,v) ∈ E.
           
           Edge Coloring :- Find a vector F = (fe), where fe is called the color of edge e, such that fe ≠ fg if e and g share a common terminal vertex u (the end points of an edge are called its terminals; a terminal is one of the pair of vertices which form that edge).

Algorithm :- 
                Frequency Exhaustive :- Choose an order of the vertices and color them one by one in that order. While coloring vertex v, choose the minimum color that satisfies all constraints with the vertices colored so far.

With this strategy we give color 0 to v1, then the minimum color that satisfies the constraints with v1 to v2, then the minimum color that satisfies the constraints with v1 and v2 to v3, and so on...
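The frequency-exhaustive strategy above is exactly greedy coloring, and can be sketched as a short routine (the function name is mine):

```python
def frequency_exhaustive(n, edges, order=None):
    """Greedy vertex coloring: visit vertices in the given order and
    assign each the smallest color unused by its already-colored neighbors."""
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    color = [None] * n
    for v in (order or range(n)):
        used = {color[u] for u in adj[v] if color[u] is not None}
        c = 0
        while c in used:   # minimum color not taken by a colored neighbor
            c += 1
        color[v] = c
    return color

# A 4-cycle is 2-colorable:
print(frequency_exhaustive(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # [0, 1, 0, 1]
```

Whatever the order, this greedy scheme never uses more than Δ + 1 colors, where Δ is the maximum degree of the graph.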

              Requirement Exhaustive :- Choose an order of the vertices. Take color 0 and, following the order, give it to every vertex that can take this color while satisfying the constraints with the vertices colored so far. Then go to color 1 and do the same, and so on...
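The requirement-exhaustive strategy can be sketched the same way, sweeping the vertex order once per color (again, the function name is mine):

```python
def requirement_exhaustive(n, edges, order=None):
    """Color-by-color greedy: for color 0, 1, 2, ... sweep the vertex order
    and give the current color to every uncolored vertex none of whose
    neighbors already holds it."""
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    order = list(order or range(n))
    color = [None] * n
    c = 0
    while any(color[v] is None for v in order):
        for v in order:
            if color[v] is None and all(color[u] != c for u in adj[v]):
                color[v] = c
        c += 1
    return color

print(requirement_exhaustive(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # [0, 1, 0, 1]
```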

This graph coloring problem has enormous application in networking. In particular, we have to allocate channels or frequencies to mobile stations (phones, notebooks, etc.), and this channel assignment reduces to a graph coloring problem.

Wednesday, November 13, 2013

Collision

Collision is an important phenomenon in nature. In this article we are going to discuss collisions between physical bodies.

Let two point objects O1 and O2, with masses m1 and m2 respectively, collide with initial velocities u1, u2 and final velocities v1, v2, with coefficient of restitution α = (v1 − v2) / (u2 − u1), the ratio of the final and initial relative velocities.

Now we will have the following 2 equations in hand :-
1.  α = (v1 − v2) / (u2 − u1)
2.  m1v1 + m2v2 = m1u1 + m2u2, the conservation of momentum.

Now solving them we get

v1 = v0 + μ2 · α · (u2 − u1)
v2 = v0 + μ1 · α · (u1 − u2)

Where
v0 = (m1u1 + m2u2) / (m1 + m2), the velocity of the center of mass of the system.
μ1 = m1 / (m1 + m2) and μ2 = m2 / (m1 + m2), the mass partitions.

Now, if :-
1)  α = 0 then v1 = v2, a totally inelastic collision.
2)  α = 1 then v1 + u1 = v2 + u2, a totally elastic collision.
3)  Else α ∈ (0,1), a natural collision. :)

Only in natural and inelastic collisions is kinetic energy changed into other forms (heat, internal energy, etc.). For the amount of mechanical energy lost, just calculate

Q = 1/2 (m1u1² + m2u2² − m1v1² − m2v2²)
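The formulas above can be packed into one routine (names are mine) that returns the final velocities and the kinetic energy lost:

```python
def collide(m1, m2, u1, u2, alpha):
    """1-D two-body collision with coefficient of restitution alpha.
    Returns the final velocities (v1, v2) and the kinetic energy lost Q."""
    M = m1 + m2
    v0 = (m1 * u1 + m2 * u2) / M        # center-of-mass velocity
    mu1, mu2 = m1 / M, m2 / M           # mass partitions
    v1 = v0 + mu2 * alpha * (u2 - u1)
    v2 = v0 + mu1 * alpha * (u1 - u2)
    ke = lambda m, v: 0.5 * m * v * v
    Q = ke(m1, u1) + ke(m2, u2) - ke(m1, v1) - ke(m2, v2)
    return v1, v2, Q

# Elastic collision of equal masses: velocities are exchanged, no energy lost.
print(collide(1.0, 1.0, 2.0, 0.0, 1.0))   # (0.0, 2.0, 0.0)
```

Setting alpha = 0 instead makes the two bodies move off together at the center-of-mass velocity, with the maximum possible loss Q.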


Saturday, July 6, 2013

Markov Chain

Many times we face problems that can be formulated as a set S = {si} of states together with given probabilities of going from one state to another. A special class of such problems is the Markov chain.

Def (Markov Chain) :- A process on a set of states S = {si} in which the probability of the next state depends only on the current state, not on the earlier history. So we can create a transition matrix P = {pij} consisting of the conditional probabilities pij of going from state j to state i.

We can also form the matrix Pⁿ = {p(n)ij}, where p(n)ij is the probability of going to state i from j in n steps.

Similarly, for a regular chain, Lt (n → ∞) Pⁿ = W, a matrix whose every column equals the stationary distribution π of the chain.
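This convergence is easy to check numerically. Below is a small sketch (plain-Python matrix routines, names mine) using a hypothetical 2-state chain with a column-stochastic transition matrix; its stationary distribution works out to (5/6, 1/6):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    """n-th power of a square matrix (n >= 1) by repeated multiplication."""
    result = P
    for _ in range(n - 1):
        result = mat_mul(result, P)
    return result

# Column-stochastic transition matrix: P[i][j] = Pr(next = i | current = j).
P = [[0.9, 0.5],
     [0.1, 0.5]]

P100 = mat_pow(P, 100)
# Both columns of P^100 converge to the stationary distribution (5/6, 1/6).
print([round(x, 4) for row in P100 for x in row])
```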