Introduction to Quantum Information and Thermodynamics
Takahiro Sagawa, Graduate School of Arts and Sciences, The University of Tokyo
Fundamental Quantum Information Seminar for Young Researchers, August 9, 2014, Yukawa Institute for Theoretical Physics, Kyoto University
WHO
• 2002-2006 Kyoto University, Faculty of Science (undergraduate) – Senior thesis: Quantum Optics Group
• 2006-2008 Tokyo Institute of Technology, Ueda Group (Master's)
• 2008-2011 The University of Tokyo, Ueda Group (Ph.D.)
• 2011-2012 Kyoto University, Hakubi Center, Program-Specific Assistant Professor – Host: Statistical Dynamics Group, Yukawa Institute for Theoretical Physics
• 2013- Associate Professor, Graduate School of Arts and Sciences, The University of Tokyo
Research Fields
• Nonequilibrium statistical mechanics, quantum information
In particular, the nonequilibrium statistical mechanics of information processing
Collaborators on Information Thermodynamics
• Masahito Ueda (Univ. Tokyo)
• Shoichi Toyabe (LMU Munich)
• Eiro Muneyuki (Chuo Univ.)
• Masaki Sano (Univ. Tokyo)
• Sosuke Ito (Univ. Tokyo)
• Naoto Shiraishi (Univ. Tokyo)
• Sang Wook Kim (Pusan National Univ.)
• Jung Jun Park (National Univ. Singapore)
• Kang-Hwan Kim (KAIST)
• Simone De Liberato (Univ. Paris VII)
• Juan M. R. Parrondo (Univ. Madrid)
• Jordan M. Horowitz (Univ. Massachusetts)
• Jukka Pekola (Aalto Univ.)
• Jonne Koski (Aalto Univ.)
• Ville Maisi (Aalto Univ.)
Outline
• Introduction
• Classical information and entropy
• Quantum information and entropy
• Second law with quantum feedback
• Information thermodynamics
• Paradox of Maxwell’s demon
• Summary
Thermodynamics in the Fluctuating World
Thermodynamics of small systems
Second law of thermodynamics
Nonequilibrium thermodynamics
Thermodynamic quantities are fluctuating!
Information Thermodynamics
Information processing at the level of thermal fluctuations
Foundation of the second law of thermodynamics
Application to nanomachines and nanodevices
(Figure: the demon obtains information about the system by measurement and applies feedback.)
Szilard Engine (1929)
Setup: a single-particle gas in contact with a heat bath at temperature T.
Initial state → insert a partition → measurement: which side is the particle on, left or right? → feedback depending on the outcome.
Free energy F = E − TS: feedback decreases the entropy by ln 2 and thereby increases the free energy by k_B T ln 2.
Isothermal, quasi-static expansion then extracts work W = k_B T ln 2.
We can control physical entropy by using information.
L. Szilard, Z. Phys. 53, 840 (1929)
Information Heat Engine
A controller measures a small system and feeds the information back, extracting work k_B T ln 2 per cycle.
The controller can increase the system's free energy even if there is no energy flow between the system and the controller.
Experimental Realizations (yet classical)
• With a colloidal particle: Toyabe, TS, Ueda, Muneyuki, & Sano, Nature Physics (2010). Efficiency: 30%. Validation of the generalized fluctuation relation for ⟨e^{−β(W−ΔF)}⟩.
• With a single electron: Koski, Maisi, TS, & Pekola, PRL (2014). Efficiency: 75%. Validation of ⟨e^{−β(W−ΔF)−I}⟩ = 1.
Claude Shannon (1916-2001)
Father of information science
Established information theory in his 1948 papers, which introduced and analyzed the Shannon information and the mutual information.
Standard textbooks of information theory: - T. M. Cover and J. A. Thomas, “Elements of Information Theory” (John Wiley and Sons, New York, 1991). - M. A. Nielsen and I. L. Chuang, “Quantum Computation and Quantum Information” (Cambridge University Press, Cambridge, 2000), chapter 11.
C. Shannon, Bell System Technical Journal 27, 379-423 and 623-656 (1948).
Shannon Information (1)
Ex. Two events with probabilities 9/10 and 1/10: the rarer event carries more information.
Information content associated with event k: ln(1/p_k)
Shannon information (the average of the information content): H = Σ_k p_k ln(1/p_k) = −Σ_k p_k ln p_k
Shannon Information (2)
H = −Σ_k p_k ln p_k  (N: the number of events k)
0 ≤ H ≤ ln N
H = 0 ⇔ p_i = 1 for a single i and p_k = 0 for k ≠ i
H = ln N ⇔ p_k = 1/N for all k
Ex. Binary system — "0" with probability p, "1" with probability 1−p:
H = −p ln p − (1−p) ln(1−p),  so that 0 ≤ H ≤ ln 2
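These bounds can be checked numerically; a minimal sketch in Python (the function name `shannon_entropy` is mine, not from the talk):

```python
import math

def shannon_entropy(probs):
    """Shannon information H = -sum_k p_k ln p_k, in nats (0 ln 0 := 0)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Binary system: "0" with probability p, "1" with probability 1 - p
H_fair = shannon_entropy([0.5, 0.5])    # maximum for N = 2: ln 2
H_sure = shannon_entropy([1.0, 0.0])    # deterministic outcome: H = 0
H_biased = shannon_entropy([0.9, 0.1])  # the 9/10 vs 1/10 example
```

The biased case falls strictly between the deterministic and uniform extremes, as the bounds require.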
Mutual Information (1)
Measurement or communication from system S to memory M (with stochastic error, in general).
Ex. Binary symmetric channel: 0 → 0 and 1 → 1, with occasional bit flips.
p(s,m) = p(m|s) p(s)
p(m|s): conditional probability characterizing the error
p(s,m): joint distribution of S and M
p(s): distribution of the measured state of S
p(m): distribution of the outcome in M
Mutual Information (2)
I(S:M) ≡ H(S) + H(M) − H(SM) = Σ_{sm} p(s,m) ln [ p(s,m) / (p(s) p(m)) ]
with
H(SM) = −Σ_{sm} p(s,m) ln p(s,m)
H(S) = −Σ_s p(s) ln p(s)
H(M) = −Σ_m p(m) ln p(m)
and p(s,m) = p(m|s) p(s) the joint distribution of S and M, where p(m|s) is the conditional probability characterizing the error.
Mutual Information (3)
Ex. Binary channel:
p(s=0) = p,  p(s=1) = 1−p
p(m=0 | s=0) = 1−ε0,  p(m=1 | s=0) = ε0
p(m=1 | s=1) = 1−ε1,  p(m=0 | s=1) = ε1
I = H(M) − p H(ε0) − (1−p) H(ε1),  where H(x) ≡ −x ln x − (1−x) ln(1−x)
(Plot: I versus the error rate for p = 1/2; I reaches ln 2 for an error-free channel and vanishes at error rate 1/2.)
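The binary-channel formula can be checked against the defining expression I = H(S) + H(M) − H(SM); a sketch, with helper names of my own:

```python
import math

def H(probs):
    """Shannon entropy of a probability list, in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def h(x):
    """Binary entropy H(x) = -x ln x - (1-x) ln(1-x)."""
    return H([x, 1.0 - x])

def binary_channel_info(p, e0, e1):
    """I(S:M) computed directly from the joint distribution p(s,m) = p(m|s) p(s)."""
    joint = [p * (1 - e0), p * e0,              # s=0 -> m=0, m=1
             (1 - p) * e1, (1 - p) * (1 - e1)]  # s=1 -> m=0, m=1
    p_s = [p, 1 - p]
    p_m = [joint[0] + joint[2], joint[1] + joint[3]]
    return H(p_s) + H(p_m) - H(joint)

p, e0, e1 = 0.5, 0.1, 0.2
p_m0 = p * (1 - e0) + (1 - p) * e1                  # probability of outcome m = 0
I_formula = h(p_m0) - p * h(e0) - (1 - p) * h(e1)   # I = H(M) - p H(e0) - (1-p) H(e1)
I_direct = binary_channel_info(p, e0, e1)
```

The two routes agree, and the extreme cases reproduce I = ln 2 (error-free, p = 1/2) and I = 0 (error rate 1/2).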
Mutual Information (4)
0 ≤ I ≤ H(M)  (left: no information; right: error-free)
Proof of the left inequality, using ln t ≤ t − 1:
−I = Σ_{sm} p(s,m) ln [ p(s)p(m) / p(s,m) ] ≤ Σ_{sm} p(s,m) [ p(s)p(m)/p(s,m) − 1 ] = Σ_{sm} p(s)p(m) − Σ_{sm} p(s,m) = 1 − 1 = 0
Equality is achieved iff p(s,m) = p(s)p(m)  (S and M are independent).
Mutual Information (5)
0 ≤ I ≤ H(M)  (left: no information; right: error-free)
Proof of the right inequality:
H(M) − I = −Σ_m p(m) ln p(m) − Σ_{sm} p(s,m) ln [ p(s,m) / (p(s)p(m)) ] = −Σ_s p(s) Σ_m p(m|s) ln p(m|s) ≥ 0
The right-hand side is the conditional Shannon information H(M|S).
Equality is achieved iff, for every s, there is a unique m such that p(m|s) = 1.
Mutual Information: Summary
Measurement with stochastic errors transfers information from the system S to the memory M (measurement device).
I(S:M) = H(S) + H(M) − H(SM),  0 ≤ I ≤ H(M)  (no information / error-free)
Shannon information: randomness of the system.
Mutual information: correlation between the system and the memory.
Quantum State
Density operator: ρ = Σ_i q_i |ψ_i⟩⟨ψ_i|
Statistical mixture of state vectors |ψ_i⟩ with probabilities q_i
Note: such a decomposition into state vectors is not unique!
Non-negativity: ρ ≥ 0
Unity: tr[ρ] = Σ_i q_i = 1
Quantum Dynamics
Kraus representation of nonunitary dynamics: E(ρ) = Σ_k M_k ρ M_k†
The M_k's are Kraus operators with Σ_k M_k† M_k = I (identity)
Trace-preserving (TP): tr[E(ρ)] = tr[ρ]
Completely positive (CP): E ⊗ I_d is positive for any ancillary system
Any CPTP map has a Kraus representation.
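A concrete instance of a CPTP map in Kraus form; the amplitude-damping channel below is my choice of example, not one from the talk:

```python
import numpy as np

g = 0.3  # decay probability of the amplitude-damping channel
M0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1 - g)]])
M1 = np.array([[0.0, np.sqrt(g)], [0.0, 0.0]])
kraus = [M0, M1]

def apply_channel(kraus, rho):
    """Kraus representation: E(rho) = sum_k M_k rho M_k^dagger."""
    return sum(M @ rho @ M.conj().T for M in kraus)

# Trace preservation follows from the completeness relation sum_k M_k^dagger M_k = I
completeness = sum(M.conj().T @ M for M in kraus)

rho = np.array([[0.2, 0.1], [0.1, 0.8]])
rho_out = apply_channel(kraus, rho)  # excited-state population decays by factor 1 - g
```

The completeness check is exactly the TP condition of the slide; complete positivity is automatic for any map written in this form.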
Quantum Measurement
Projection measurement (error-free):
• Observable A with projection operators {P_k}
• Probability: p_k = tr(P_k ρ)
• Post-measurement state: ρ_k = P_k ρ P_k / p_k
General measurement:
• Kraus operators {M_{k,i}}; k: measurement outcome
• POVM: E_k = Σ_i M_{ki}† M_{ki},  with Σ_k E_k = I
• Probability: p_k = tr(E_k ρ) = Σ_i tr(M_{ki} ρ M_{ki}†)
• Post-measurement state: ρ_k = Σ_i M_{ki} ρ M_{ki}† / p_k
Quantum Entropy
Von Neumann entropy: S(ρ) = −tr(ρ ln ρ), with ρ a density operator
Characterizes the randomness of the classical mixture in the density operator:
for ρ = Σ_i q_i |ψ_i⟩⟨ψ_i| with an orthonormal basis {|ψ_i⟩}, S(ρ) = −Σ_i q_i ln q_i
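S(ρ) is computable from the eigenvalues of ρ; a minimal numpy sketch (the function name is mine):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho ln rho) in nats, via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop zero eigenvalues: 0 ln 0 := 0
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # |0><0|: no randomness, S = 0
mixed = 0.5 * np.eye(2)                    # maximally mixed qubit: S = ln 2
```

The two extreme qubit states bracket the possible values 0 ≤ S ≤ ln 2.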
Von Neumann and Thermodynamic Entropies
The von Neumann entropy is consistent with the thermodynamic entropy in the canonical distribution.
Canonical distribution: ρ_can = e^{β(F−E)}  (E: Hamiltonian)
Free energy: F = −k_B T ln tr[e^{−βE}]
Average energy: ⟨E⟩_can = tr[E ρ_can]
Then k_B T S(ρ_can) = −k_B T tr[ρ_can ln ρ_can] = ⟨E⟩_can − F
Cf. the thermodynamic entropy: ⟨E⟩_can = F + T S_therm
Variety of Quantum Information
• Quantum mutual information
• Quantum discord
• Holevo’s chi
• QC-mutual information (Groenewold information)
Bipartite Quantum System
ρ_AB on H_A ⊗ H_B
Reduced states: ρ_A = tr_B[ρ_AB],  ρ_B = tr_A[ρ_AB]
Quantum Mutual Information (1)
I(A:B) ≡ S(ρ_A) + S(ρ_B) − S(ρ_AB)
I(A:B) ≥ 0  ⇔  S(ρ_AB) ≤ S(ρ_A) + S(ρ_B)  (subadditivity of the von Neumann entropy)
Purely classical correlation ⇒ classical mutual information:
for ρ_AB = Σ_{xy} q_{xy} |x⟩⟨x| ⊗ |y⟩⟨y| with orthonormal bases,
I(A:B) = Σ_{xy} q_{xy} ln [ q_{xy} / (q_x q_y) ]
Quantum Mutual Information (2)
Example: 2 qubits
ρ_AB = |00⟩⟨00|  (no correlation):  I(A:B) = 0
ρ_AB = (1/2)( |00⟩⟨00| + |11⟩⟨11| )  (classical correlation):  I(A:B) = ln 2
ρ_AB = (1/2)( |00⟩⟨00| + |00⟩⟨11| + |11⟩⟨00| + |11⟩⟨11| )  (entanglement):  I(A:B) = 2 ln 2
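The three two-qubit examples can be verified numerically; a sketch where `partial_trace` is my helper, written for the 2-qubit case only:

```python
import numpy as np

def S(rho):
    """Von Neumann entropy in nats."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log(ev)))

def partial_trace(rho_ab, keep):
    """2-qubit partial trace: keep=0 returns rho_A, keep=1 returns rho_B."""
    r = rho_ab.reshape(2, 2, 2, 2)  # indices (a, b, a', b')
    return np.trace(r, axis1=1, axis2=3) if keep == 0 else np.trace(r, axis1=0, axis2=2)

def qmi(rho_ab):
    """I(A:B) = S(rho_A) + S(rho_B) - S(rho_AB)."""
    return S(partial_trace(rho_ab, 0)) + S(partial_trace(rho_ab, 1)) - S(rho_ab)

k00 = np.array([1.0, 0.0, 0.0, 0.0])
k11 = np.array([0.0, 0.0, 0.0, 1.0])
product = np.outer(k00, k00)                                 # |00><00|
classical = 0.5 * (np.outer(k00, k00) + np.outer(k11, k11))  # classical correlation
bell = 0.5 * np.outer(k00 + k11, k00 + k11)                  # (|00>+|11>)/sqrt(2)
```

The entangled Bell state carries twice the mutual information of the classically correlated mixture.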
Quantum Mutual Information (3)
Data processing inequality: I(A:B)_{(E_A ⊗ E_B)(ρ_AB)} ≤ I(A:B)_{ρ_AB}
E_A, E_B: CPTP maps on the individual systems
A straightforward consequence of the monotonicity of the quantum relative entropy.
Quantum mutual information is decreased by quantum operations on individual systems.
Quantum Discord (1)
{P_x = |x⟩⟨x|}: projections onto an orthonormal basis of A
Dephased state: ρ'_AB = Σ_x P_x ρ_AB P_x
I(A:B)_{ρ'_AB} ≤ I(A:B)_{ρ_AB}  (data processing inequality)
δ(B|A)_{ρ_AB, {P_x}} ≡ I(A:B)_{ρ_AB} − I(A:B)_{ρ'_AB} ≥ 0
Quantum discord: δ(B|A)_{ρ_AB} ≡ min_{ {P_x} } δ(B|A)_{ρ_AB, {P_x}}
Quantum Discord (2)
Example: 2 qubits
ρ_AB = (1/2)( |00⟩⟨00| + |11⟩⟨11| )  (classical correlation):  δ(B|A) = 0
ρ_AB = (1/2)( |00⟩⟨00| + |00⟩⟨11| + |11⟩⟨00| + |11⟩⟨11| )  (entanglement):  δ(B|A) = ln 2
The discord quantifies the "quantum correlation".
Quantum Discord (3)
Non-zero discord is possible even without entanglement, i.e., for separable states.
Ollivier & Zurek, PRL 88, 017901 (2001)
Holevo's chi (1)
Ensemble {p_x, ρ_x}: classical probabilities p_x, density operators ρ_x;  ρ = Σ_x p_x ρ_x
Holevo's chi: χ ≡ S(ρ) − Σ_x p_x S(ρ_x)
A kind of "quantum-classical correlation": χ = I(A:B) for ρ_AB = Σ_x p_x |x⟩_A⟨x| ⊗ ρ_x
Such a state has zero discord: δ(B|A) = 0
If the ρ_x's are simultaneously diagonalizable ⇒ Holevo's chi reduces to the classical mutual information.
Holevo's chi (2)
For any POVM {E_y}: I ≤ χ
p(x,y) = tr[E_y ρ_x] q(x),  p(y) = Σ_x p(x,y),  q(x) = Σ_y p(x,y)
I = Σ_{xy} p(x,y) ln [ p(x,y) / (q(x) p(y)) ]
Holevo's chi gives an upper bound on the accessible classical information x encoded in ρ.
Proof: straightforwardly obtained from the data processing inequality; map ρ to a diagonal matrix whose diagonal elements are p(y).
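A numerical illustration of the bound I ≤ χ, for a hypothetical ensemble {|0⟩, |+⟩} with equal probabilities and a computational-basis POVM (my choice of example, not one from the talk):

```python
import numpy as np

def S(rho):
    """Von Neumann entropy in nats."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log(ev)))

def H(probs):
    """Shannon entropy in nats, skipping zero entries."""
    return float(-sum(p * np.log(p) for p in probs if p > 1e-12))

ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
states = [np.outer(ket0, ket0), np.outer(ketp, ketp)]
probs = [0.5, 0.5]
rho = probs[0] * states[0] + probs[1] * states[1]

# Holevo's chi; the ensemble states are pure, so S(rho_x) = 0 and chi = S(rho)
chi = S(rho) - sum(p * S(r) for p, r in zip(probs, states))

# accessible information with the POVM {|0><0|, |1><1|}
E = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
joint = [probs[x] * float(np.trace(E[y] @ states[x]))
         for x in range(2) for y in range(2)]
px = [joint[0] + joint[1], joint[2] + joint[3]]
py = [joint[0] + joint[2], joint[1] + joint[3]]
I = H(px) + H(py) - H(joint)
```

For this non-orthogonal ensemble the measurement recovers strictly less than χ nats, as the Holevo bound demands.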
QC-mutual Information (1)
H. J. Groenewold, Int. J. Theor. Phys. 4, 327 (1971); M. Ozawa, J. Math. Phys. 27, 759 (1986); TS and M. Ueda, PRL 100, 080403 (2008).
I_QC ≡ S(ρ) − Σ_k p_k S(ρ_k)
ρ: measured density operator
p_k = tr[M_k ρ M_k†]: probability of obtaining outcome k
ρ_k = M_k ρ M_k† / p_k: post-measurement state with outcome k
(A single Kraus operator is assumed for each outcome.)
I_QC describes the information flow from the Quantum system to the Classical outcome by quantum measurement.
QC-mutual Information (2)
0 ≤ I_QC ≤ H,  with H = −Σ_k p_k ln p_k  (left: no information; right: error-free & classical)
Left equality: every POVM element is proportional to the identity operator (no information is obtained).
Right equality: every POVM element is a projection commuting with the measured state; for such a classical measurement, I_QC reduces to the classical mutual information.
If the measured state is a pure state: I_QC = 0.
QC-mutual Information (3)
Theorem: the QC-mutual information gives an upper bound on the accessible classical information x encoded in ρ — a variant (a "dual") of the Holevo bound.
F. Buscemi, M. Hayashi, and M. Horodecki, PRL 100, 210504 (2008).
ρ = Σ_x q(x) ρ_x: an arbitrary decomposition; the ρ_x are not necessarily orthogonal.
p(x,k) = tr[M_k ρ_x M_k†] q(x)
p_k = Σ_x p(x,k) = tr[M_k ρ M_k†],  q(x) = Σ_k p(x,k)
I = Σ_{xk} p(x,k) ln [ p(x,k) / (q(x) p_k) ]  ⇒  I ≤ I_QC
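Both limiting cases of I_QC can be checked numerically with a projective measurement in the computational basis; a sketch, with example states of my own:

```python
import numpy as np

def S(rho):
    """Von Neumann entropy in nats."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log(ev)))

def qc_mutual_info(rho, kraus):
    """I_QC = S(rho) - sum_k p_k S(M_k rho M_k^dagger / p_k)."""
    avg = 0.0
    for M in kraus:
        sigma = M @ rho @ M.conj().T
        p = float(np.trace(sigma))
        if p > 1e-12:
            avg += p * S(sigma / p)
    return S(rho) - avg

proj = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]  # projective z-measurement

# classical error-free case: rho diagonal in the measured basis -> I_QC = H
rho_cl = np.diag([0.7, 0.3])
H_cl = -(0.7 * np.log(0.7) + 0.3 * np.log(0.3))

# pure measured state -> I_QC = 0 even though H = ln 2
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(ketp, ketp)
```

The first state saturates the upper bound I_QC = H; the second illustrates that a pure measured state yields I_QC = 0.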
The Second Law of Thermodynamics (without Feedback)
W_ext ≤ −ΔF  (the equality is achieved in the quasi-static process)
We extract work from a system in contact with a heat bath (temperature T) by changing external parameters (volume of the gas, frequency of optical tweezers, …).
With a cycle, ΔF = 0 holds, and therefore W_ext ≤ 0 (Kelvin's principle).
Quantum Feedback Control
Quantum system Outcome k
Controller
Control protocol can depend on outcome k after measurement
Quantum measurement
Generalized Second Law with Feedback
W_ext ≤ −ΔF + k_B T I_QC  — TS and M. Ueda, PRL 100, 080403 (2008)
The work extracted by the demon can exceed the conventional bound −ΔF by at most k_B T times the QC-mutual information.
The equality can be achieved:
K. Jacobs, PRA 80, 012322 (2009); J. M. Horowitz & J. M. R. Parrondo, EPL 95, 10005 (2011); D. Abreu & U. Seifert, EPL 94, 10001 (2011); J. M. Horowitz & J. M. R. Parrondo, New J. Phys. 13, 123019 (2011); T. Sagawa & M. Ueda, PRE 85, 021104 (2012); M. Bauer, D. Abreu & U. Seifert, J. Phys. A: Math. Theor. 45, 162001 (2012)
Information Heat Engine
Conventional heat engine: Heat → Work
Efficiency: η = W_ext / Q_H ≤ 1 − T_L / T_H  (equality: Carnot cycle)
(Figure: engine operating between baths at T_H and T_L, absorbing Q_H, releasing Q_L, extracting W_ext.)
Information heat engine (e.g., the Szilard engine): Mutual information → Work and free energy
W_ext ≤ −ΔF + k_B T I_QC
Generalized Szilard Engine (1)
T. Sagawa & M. Ueda, PRE 85, 021104 (2012)
Binary measurement with error rates ε0, ε1; the particle is in state 0 with probability p and in state 1 with probability 1−p.
The feedback protocol is determined by (v0, v1).
With (v0, v1) = (1, 1), ε0 = ε1 = 0, and p = 1/2, this reduces to the Szilard engine.
Generalized Szilard Engine (2)
Work extraction, maximized over the feedback protocol:
W_ext = k_B T I,  with I = H(Y) − p H(ε0) − (1−p) H(ε1)
(Y: the measurement outcome; the same binary-channel mutual information as before.)
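The maximal extracted work k_B T I can be evaluated for given p, ε0, ε1; a sketch (units of k_B T = 1 unless passed explicitly):

```python
import math

def h(x):
    """Binary entropy in nats, with h(0) = h(1) = 0."""
    return 0.0 if x <= 0.0 or x >= 1.0 else -x * math.log(x) - (1 - x) * math.log(1 - x)

def max_work(p, e0, e1, kBT=1.0):
    """W_ext = k_B T I with I = H(Y) - p H(e0) - (1-p) H(e1)."""
    p_y0 = p * (1 - e0) + (1 - p) * e1  # probability of outcome y = 0
    return kBT * (h(p_y0) - p * h(e0) - (1 - p) * h(e1))

# error-free symmetric case: the original Szilard engine, W = k_B T ln 2
W_szilard = max_work(0.5, 0.0, 0.0)
# completely noisy measurement: I = 0, so no work can be extracted
W_noisy = max_work(0.5, 0.5, 0.5)
```

Intermediate error rates interpolate between these extremes: the extractable work shrinks exactly with the mutual information gained.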
Langevin System (1)
Overdamped Langevin equation with a harmonic potential:
dx/dt = −∂V(x, λ1, λ2)/∂x + ξ(t),  V(x, λ1, λ2) = (λ1/2)(x − λ2)²
Gaussian white noise: ⟨ξ(t) ξ(t')⟩ = 2 k_B T δ(t − t')
A feedback protocol was proposed in D. Abreu and U. Seifert, Europhys. Lett. 94, 10001 (2011).
Langevin System (2)
V(x, λ1, λ2) = (λ1/2)(x − λ2)²
Step 1: Equilibrium with λ1 = k, λ2 = 0.
Step 2: Measurement of x with Gaussian noise, giving outcome y.
S = k_B T / k: variance of the initial equilibrium distribution; N: noise intensity.
Step 3: Instantaneous switching to λ1' = k (1 + S/N), λ2' = S y / (S + N).
Step 4: Quasi-static expansion back to λ1 = k, with λ2' unchanged.
Langevin System (3)
Mutual information: I = (1/2) ln(1 + S/N)
Work extraction: W_ext = k_B T I
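The mutual information of this Gaussian measurement follows from the differential entropies of Gaussians, I = H(Y) − H(Y|X); a sketch (function names are mine):

```python
import math

def gaussian_entropy(var):
    """Differential entropy (nats) of a 1D Gaussian with variance var."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def gaussian_mutual_info(S, N):
    """I = (1/2) ln(1 + S/N) for outcome y = x + noise, x ~ N(0, S), noise ~ N(0, N)."""
    return 0.5 * math.log(1 + S / N)

S_var, N_var = 2.0, 0.5
I = gaussian_mutual_info(S_var, N_var)
# cross-check: y has variance S + N, while y given x has variance N
I_check = gaussian_entropy(S_var + N_var) - gaussian_entropy(N_var)
```

In the precise-measurement limit N → 0 the information (and hence the extractable work k_B T I) diverges logarithmically; in the noisy limit N → ∞ both vanish.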
Entropy Production in Nonequilibrium Dynamics
System S exchanges heat Q with a heat bath B (inverse temperature β) while work W is performed.
Entropy production in the total system: σ_SB ≡ ΔS_S + βQ
ΔS_S: change in the von Neumann entropy of S
If the initial and the final states are canonical distributions: σ_SB = β(W − ΔF)
ΔF: free-energy difference
Second Law of Thermodynamics
σ_SB ≥ 0: entropy production in the whole universe is nonnegative!
This holds true for nonequilibrium initial and final states.
If the initial state is the canonical distribution: W ≥ ΔF.
A lot of "derivations" are known (positivity of the relative entropy, its monotonicity, the fluctuation theorem & Jarzynski equality, …).
System and Memory
Memory (Controller)
System (Working engine)
Information
Feedback
Consider the role of memory explicitly
Entropy Change in System and Memory
S(ρ_SM) = S(ρ_S) + S(ρ_M) − I(S:M)
ΔS_SM = ΔS_S + ΔS_M − ΔI(S:M)
σ_SMB = ΔS_S + ΔS_M − ΔI(S:M) + βQ
Memory Structure
Symmetric memory: "0" and "1" stored in identical potential wells; asymmetric memory: wells of different shapes.
H_M = ⊕_k H_M^(k): the total Hilbert space of the memory is the direct sum of subspaces corresponding to the outcomes k.
Measurement Process
Initial state: ρ_SM = ρ_S ⊗ ρ_M^(0), where ρ_M^(0) is on H_M^(0).
CPTP map of the measurement process: E_meas.
Post-measurement state (projection postulate): assume
ρ'_SM = E_meas(ρ_SM) = Σ_k p_k ρ'_S^(k) ⊗ ρ'_M^(k),  with ρ'_M^(k) on H_M^(k)
p_k = tr[M_k ρ_S M_k†]: probability of obtaining outcome k
ρ'_S^(k) = M_k ρ_S M_k† / p_k: post-measurement state of S with outcome k
ρ'_S = Σ_k p_k ρ'_S^(k): post-measurement state of S
Entropy Change during Measurement
Before measurement: S(ρ_SM) = S(ρ_S) + S(ρ_M)
After measurement: S(ρ'_SM) = S(ρ'_S) + S(ρ'_M) − I(S':M')
Mutual information: I(S':M') = S(ρ'_S) − Σ_k p_k S(ρ'_S^(k))
Hence ΔS_SM^meas = ΔS_S^meas + ΔS_M^meas − I(S':M')
(ΔS_S^meas reflects the back-action of the measurement.)
Second Law for Measurement
σ_SMB^meas = ΔS_S^meas + ΔS_M^meas − I(S':M') + βQ_M ≥ 0
Assumption: heat is absorbed only by the memory during the measurement.
Using ΔS_S^meas = S(ρ'_S) − S(ρ_S) and I(S':M') = S(ρ'_S) − Σ_k p_k S(ρ'_S^(k)):
I(S':M') − ΔS_S^meas = S(ρ_S) − Σ_k p_k S(ρ'_S^(k)) = I_QC  (the QC-mutual information)
Therefore ΔS_M^meas + βQ_M ≥ I_QC.
Energy Cost for Measurement
TS and M. Ueda, PRL 102, 250602 (2009); 106, 189901(E) (2011).
From ΔS_M^meas + βQ_M ≥ I_QC and the first law:
W_M ≥ ΔE_M − k_B T ΔS_M^meas + k_B T I_QC
If ΔF = 0 (symmetric memory) & H = I_QC (classical and error-free measurement): W_M ≥ 0, the same bound as without measurement.
In general there is an additional energy cost to obtain information: information is not free!
Feedback Process
CPTP map of the feedback process: E_fb = Σ_k E_k^(fb) ⊗ P_k
P_k: projection super-operator onto H_M^(k) — only the classical outcome is used for feedback.
Post-feedback state:
ρ''_SM = E_fb(ρ'_SM) = Σ_k p_k ρ''_S^(k) ⊗ ρ'_M^(k),  with ρ''_S^(k) = E_k^(fb)[ρ'_S^(k)]
Assume the memory states ρ'_M^(k) are unchanged.
Entropy Change during Feedback
Before feedback: S(ρ'_SM) = S(ρ'_S) + S(ρ'_M) − I(S':M'),  with I(S':M') = S(ρ'_S) − Σ_k p_k S(ρ'_S^(k))
After feedback: S(ρ''_SM) = S(ρ''_S) + S(ρ'_M) − I(S'':M'')  (the memory part is unchanged)
Hence ΔS_SM^fb = ΔS_S^fb + I(S':M') − I(S'':M'')
Second Law for Feedback
σ_SMB^fb = ΔS_S^fb + I(S':M') − I(S'':M'') + βQ_S ≥ 0
⇒ ΔS_S^fb + βQ_S ≥ I(S'':M'') − I(S':M') ≥ −I(S':M')
The entropy of the system plus bath can decrease by feedback, by up to the mutual information I(S':M').
Entire Entropy Change in Engine
Adding ΔS_S^meas to both sides of ΔS_S^fb + βQ_S ≥ −I(S':M'):
ΔS_S + βQ_S ≥ ΔS_S^meas − I(S':M') = −I_QC  (the QC-mutual information)
During measurement and feedback: ΔS_S + βQ_S ≥ −I_QC
⇒ W_ext ≤ −ΔF_S + k_B T I_QC
Generalized Second Law: Summary
Measurement and feedback:
Memory M + heat bath B: ΔS_M + βQ_M ≥ +I_QC
System S + heat bath B: ΔS_S + βQ_S ≥ −I_QC
“Duality” between Measurement and Feedback
Time-reversal transformation
Swap system and memory
Measurement becomes feedback (and vice versa)
(Figure: S and M exchange roles under time reversal, turning measurement into feedback and vice versa.)
Problem
What compensates for the entropy decrease in the system during feedback?
For the Szilard engine: ΔS_S + βQ_S = −ln 2
Conventional Arguments
• Brillouin: the measurement process!
• Bennett & Landauer: the erasure process! (from the Landauer principle)
The latter has been widely accepted since the 1980's…
Total Entropy Production
Generalized second laws have been derived from the second law for the total system:
σ_SMB = ΔS_S + ΔS_M − ΔI(S:M) + βQ ≥ 0
If the quantum mutual information is taken into account, the total entropy production is always nonnegative for each process of measurement or feedback.
Revisit the Problem
What compensates for the entropy decrease in the system during feedback?
For the Szilard engine, ΔS_S + βQ_S = −ln 2, and the consumed quantum mutual information compensates for it:
σ_SMB = ΔS_S + βQ_S + I = −ln 2 + ln 2 = 0 ≥ 0
Resolution of the "Paradox"
• Maxwell's demon is consistent with the second law for the measurement and feedback processes individually
– The quantum mutual information is the key
• We don't need the Landauer principle to understand the consistency
– A misleading argument has been accepted for over 30 years
Summary
Unified theory of quantum-information thermodynamics
Minimal energy cost for quantum information processing
Paradox of Maxwell’s demon
Thank you for your attention!
Theory:
T. Sagawa & M. Ueda, Phys. Rev. Lett. 100, 080403 (2008).
T. Sagawa & M. Ueda, Phys. Rev. Lett. 102, 250602 (2009); 106, 189901(E) (2011).
T. Sagawa & M. Ueda, Phys. Rev. Lett. 109, 180602 (2012).
T. Sagawa & M. Ueda, New Journal of Physics 15, 125012 (2013).
Experiment:
S. Toyabe, T. Sagawa, M. Ueda, E. Muneyuki, & M. Sano, Nature Physics 6, 988 (2010).
J. V. Koski, V. F. Maisi, T. Sagawa, & J. P. Pekola, Phys. Rev. Lett. 113, 030601 (2014).
Review:
J. M. R. Parrondo, J. M. Horowitz, & T. Sagawa, Nature Physics, coming soon!