Replies
CrazyEngineers powered by Jatra Community Platform
@jeffrey-xA7lUP • Feb 26, 2013
Consider a hybrid circuit (one that uses both an AC and a DC supply), such as a common-emitter amplifier. Here there are two inputs: the DC supply and the AC signal applied at the base.
To find the response of the circuit with only the DC supply present (AC input set to zero), we do DC analysis.
To find the response of the amplifier to the AC input alone, we use AC analysis.
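As a rough sketch of what DC analysis produces for a common-emitter stage: with the AC source zeroed, you solve for the bias (Q) point. All component values and the current gain beta below are assumptions for illustration, not values from this thread.

```python
# DC bias point of a simple fixed-bias common-emitter stage.
# All values (VCC, VBE, RB, RC, beta) are illustrative assumptions.
VCC = 12.0     # DC supply voltage (V)
VBE = 0.7      # base-emitter drop of a silicon BJT (V)
RB = 470e3     # base resistor (ohms)
RC = 2.2e3     # collector resistor (ohms)
beta = 100     # assumed current gain

# With the AC source set to zero, only the DC supply drives the circuit:
IB = (VCC - VBE) / RB    # base current
IC = beta * IB           # collector current
VCE = VCC - IC * RC      # collector-emitter voltage at the Q-point

print(f"IB  = {IB*1e6:.1f} uA")   # 24.0 uA
print(f"IC  = {IC*1e3:.2f} mA")   # 2.40 mA
print(f"VCE = {VCE:.2f} V")       # 6.71 V
```

The Q-point (IC, VCE) found here is the operating point around which the AC signal later swings; AC analysis then treats this bias as fixed and looks only at the small-signal behaviour.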
Also, no practical circuit gives the desired output the instant the supply is turned on. There is an initial period during which all the dynamic components (capacitors and inductors) charge to their required values. This initial settling period is called the transient time, and the study of the circuit's behaviour during it is called transient analysis.
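The settling behaviour described above can be sketched with the simplest dynamic circuit, an RC charging network. The component values are made up for illustration, and "settling time" is taken here as the time to reach 99% of the final value.

```python
import math

# Transient response of an RC charging circuit:
#   v(t) = V * (1 - exp(-t / (R*C)))
# R, C and V are illustrative values, not from the thread.
V = 5.0      # step input (V)
R = 10e3     # ohms
C = 1e-6     # farads
tau = R * C  # time constant (s)

def v_cap(t):
    """Capacitor voltage t seconds after the supply is switched on."""
    return V * (1.0 - math.exp(-t / tau))

# Solve V*(1 - e^(-t/tau)) = 0.99*V  =>  t = tau * ln(100)
t_settle = tau * math.log(100)
print(f"tau = {tau*1e3:.1f} ms")                  # 10.0 ms
print(f"99% settling time = {t_settle*1e3:.2f} ms")  # 46.05 ms
```

This is exactly what a simulator's transient analysis computes, just numerically and for arbitrary networks instead of this one closed-form case.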
@abhishek-fg9tRh • Feb 26, 2013
First of all, Priyanka S G asked: "What is the difference between AC analysis, DC analysis and transient analysis?"
What's analysis?
In any network circuit there are numerous capacitors, resistors, and dependent & independent sources.
Now, using the basic tools (KVL, KCL, initial conditions, Millman's theorem, Norton's theorem, the superposition theorem, etc.), you work out what each and every component of the network is doing, for example how much energy capacitor X stores at some specific time, or how much voltage appears at node Y. This is called analysis.
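As a tiny worked example of the KCL-style analysis just described: a source feeds one unknown node through a resistor, with two more resistors to ground. All values are made up for illustration.

```python
# Nodal analysis (KCL) at a single node A:
# a 10 V source feeds node A through R1; R2 and R3 tie A to ground.
# All values are illustrative.
Vs = 10.0
R1, R2, R3 = 1e3, 2e3, 2e3   # ohms

# KCL at node A: (Va - Vs)/R1 + Va/R2 + Va/R3 = 0
# => Va * (1/R1 + 1/R2 + 1/R3) = Vs/R1
Va = (Vs / R1) / (1/R1 + 1/R2 + 1/R3)
print(f"Va = {Va:.2f} V")   # 5.00 V: R2 || R3 = 1 kOhm forms a 1:1 divider with R1
```

Larger networks give one such equation per node; solving the resulting system of equations is all that circuit "analysis" means here.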
Now, AC analysis and DC analysis are much the same procedure, with a few differences.
Transient Analysis - I have no idea !
Let's make it an interesting/educating debate:
Why does an amplifier use both an AC and a DC supply when only the AC signal is to be amplified? 😉
(Just a debate)
@jeffrey-xA7lUP • Feb 27, 2013
Troll_So_Hard asked: "Why does an amplifier use both an AC and a DC supply when only the AC signal is to be amplified? 😉 (Just a debate)"
The most important thing we need to understand is this: OUTPUT POWER CAN NEVER EXCEED INPUT POWER.
With that said, consider an amplifier whose AC input power is a few milliwatts, while its output is in the range of watts.
Here the input signal power is far less than the AC output power obtained from the amplifier. That difference in power comes from the DC source applied to the amplifier circuit.
So now let us define an AMPLIFIER:
It is a circuit that uses DC power from a DC source to magnify the AC input signal to the desired power rating.
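The power bookkeeping above can be made concrete with some round numbers (all figures below are illustrative, not measurements): a milliwatt-level input, a watt-level output, and a DC supply that makes up the difference.

```python
import math

# Power bookkeeping for an amplifier: the extra output power comes
# from the DC supply, not from the AC input. Numbers are illustrative.
P_in_ac = 1e-3   # AC input signal power: 1 mW
P_out_ac = 5.0   # AC output power: 5 W
P_dc = 12.0      # power drawn from the DC supply: 12 W

gain_db = 10 * math.log10(P_out_ac / P_in_ac)
P_dissipated = P_dc + P_in_ac - P_out_ac   # lost as heat in the circuit
efficiency = P_out_ac / P_dc

print(f"power gain = {gain_db:.0f} dB")        # 37 dB
print(f"dissipated = {P_dissipated:.3f} W")    # 7.001 W
print(f"efficiency = {efficiency:.0%}")        # 42%
```

Note that total output (AC out plus heat) never exceeds total input (AC in plus DC supply), which is exactly the point made above: the "gain" is paid for by the DC source.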