"The subject of optimization is a fascinating blend of heuristics and rigour, of theory and experiment."
Announcements
★★★ 2025/01/09 ★★★ This course will be taught as an EMI (English as a Medium of Instruction) course, in alignment with our government's policy to promote bilingual education. The lectures will be conducted in English (approximately 75%) and Mandarin (approximately 25%) to make the content more accessible and help ease any language barriers.
Thank you all for your hard work and participation this semester~ I wish you a smooth journey ahead and the courage to face all the challenges that are harder than mathematics :)
Photo taken on 2025.06.12, after the final exam
The advice everyone left for the junior students can be found here. Thank you for your encouragement and kind words; I also hope the junior students can learn more solidly through the methods you suggested and make CCU Communications even better ^_^
The responses to this question were originally not going to be published online, but I found that they contain many interesting points worth exchanging or passing on to junior students, so I am posting them here for everyone to read. The question was: "If this course could start over, how would I want to learn? Would I regret taking this course? And how should the instructor arrange things so that I could learn well?"
I somewhat regret that my math was weak and that I did not spend more time studying than others. Only near the end of the semester did I finally understand the earlier material, and only then did I think, "Ah, so that is how it works; it is actually quite interesting. Why couldn't I get it back then?"
I wish I had consistently kept well-organized notes and read the textbook alongside the lectures. I gained a lot from this course, and the advice the instructor gave during the semester was very helpful; few teachers in the past have been willing to share how to study or how to take notes.
I should have developed the habit of finishing homework early and organizing my notes carefully in the first half of the course. That would have made the earlier material more solid, so that when the instructor moved on to the next part I could spend lecture time thinking and reflecting instead of cross-checking and reviewing the previous part.
I found it very hard, but I still enjoyed learning it.
I would read more of the textbook; the notes, the textbook, and the examples together make everything very clear. I do not regret taking this course.
I would choose to read the textbook once before listening to the lecture. I think this makes such difficult material easier to accept. No regrets, though I hope the instructor can follow the textbook's notation as much as possible, which would make it easier to compare the lecture notes with the book when reviewing.
I would put more effort into the material covered by the final exam. No regrets at all.
Besides taking notes, I wish I had worked more exercises to reinforce my memory. The programming assignments were good; they gave a feel for real research applications.
If I could start over, I would change the way I take notes. I am probably better suited to writing notes directly in the textbook, so in class I would first roughly jot down what the instructor mentions or what I do not understand, and then transfer it into the relevant passage of the textbook. Writing on my iPad is slow and messy, so I did not write on the textbook PDF in class. I do not regret taking this course.
Think more about the ideas: learn their uses and how they differ from what was learned before, grasp the ideas from the proofs, and understand their applications and origins, with an in-depth look at each function. No regrets; I benefited greatly.
Some of the material before the midterm was rather abstract. The instructor told us to try to understand it in our own words when taking notes, and I tried, but I am not sure whether I was just rationalizing to myself; my results on the exercises and exams were not ideal. After the midterm, with research work to handle as well, the time I allocated to this course did decrease, so I am now a bit nervous about my semester grade. I can understand why the instructor said at the beginning of the semester that the course is not recommended for graduate students, but I do not regret taking it; I could feel the instructor's passion for teaching. I am still thinking about how optimization could fit into my thesis.
I do not regret taking this course, and I think my learning pace matched the instructor's rhythm just right.
If I could start over, I would hope to spend more time understanding the principles. I tried to take control of all my time, but toward the end I fell apart, felt constantly tired, and just let things sit there. My original goal was to review my in-class notes as soon as I got home, but that plan was canceled by my own laziness, perhaps because my midterm grade was decent and I thought I could take it easy afterward. Apparently that does not work: knowledge is all interconnected, and missing one section means missing out in many places. If I did it again, I would probably photograph the board first, mark the key points during the lecture instead of blindly copying notes, and then rewrite the notes at home to sort everything out again. As for regret, no; I chose this course for the instructor in the first place, and it is also required by my lab, although without those two reasons I honestly might not have taken it. I think Prof. Weng teaches with great enthusiasm, genuinely converses with students, and spends time on them, though some knowledge-level or brain-capacity problems are beyond what a teacher can do much about XDD. Handing out practice sheets for us to do at home was a good move; at least we were not studying blindly before the exams and had some direction, and the exams were not designed to trip students up but to make sure we really learned something.
If I had more time, I would hope to read more of the textbook. No regrets; I did learn things.
I wish I had started studying the textbook earlier; that way I could better understand what the instructor was saying and tell which parts were techniques the instructor already knows and which parts I did not know, instead of always feeling rushed.
I probably need to work through the derivations a few more times on my own before I can fully consolidate the ideas and concepts. There was no problem with the instructor's teaching.
I do not regret taking this course. Organizing notes every week is really the key point; skip just one week and by the next week it is completely forgotten.
No regrets, because it was a very substantial course. The learning method I would want is about the same as what I did this semester: listen in class, then organize my notes at home. If possible, I hope the instructor could post his own organized notes after class, because sometimes the board writing is unclear, or the instructor talks about other things while we are copying, which makes the board look a bit messy. Overall, though, the board work was still very good!
I wish I had familiarized myself with the mathematical notation and definitions of each new chapter before it began, so I could sync up faster during the instructor's explanations. No regrets.
No regrets, but I wish I could have read the textbook together with the instructor's notes after each class. For some reason I kept getting sick this semester, so I simply did not have the energy.
This course covered very rich and substantial content and laid the foundations for us, so we can read more advanced material whenever we need to. I highly recommend this excellent course and the excellent Prof. Jian-Jia!
I only wish I had more time to read the textbook properly and absorb what the instructor taught in class. I do not regret taking this course; on the contrary, I feel I learned a great deal, and I finally understand why the introductory linear programming I took in the math department was taught the way it was, a real moment of sudden clarity. I would somewhat hope the instructor could write up the proofs and post them on ecourse, which I think would help me absorb the material better; after all, thinking and copying at the same time in class sometimes leaves my notes messy, and for someone like me who is not good at note-taking, that could be a huge help.
Postscript on the feedback: Common learning difficulties include losing the detailed derivations and conceptual analysis while taking notes and then getting stuck, or finding it easy to copy down someone else's explanation but hard to break the content apart, digest it, and turn it into notes of your own. Running into these challenges does not mean you are falling behind; it means you are attempting to think more deeply. If you did not feel these changes during the process, try observing how your classmates work or discussing with the instructor. Often it is not that you cannot do it, but that your life is being chased by time, leaving you unable to settle down and notice. Quite a few students also reported moving from merely copying to being able to write down their own understanding, examples, or questions, which is a very positive shift. The key is to push through the urge to "just copy it down quickly" and replace it with the habit of "thinking slowly before writing"; that is growth. I believe some of you know what I am talking about.
I hope everyone understands that the struggle and the moments of insight in learning form a very special process. As long as you persist, what you gain is truly your own. Do not let fame and fortune cloud your eyes; each of you is an independent individual with your own life lessons. Best wishes for a smooth future!
Instructor: Jian-Jia Weng
Time: Tuesdays and Thursdays, 10:15-11:30
Location: R101, Innovation Building
Office Hour: Upon Request
Teaching Method: Chalkboard Teaching with Supplementary Video Recordings
Textbook: E. K. P. Chong, W.-S. Lu, and S. H. Zak, An Introduction to Optimization with Applications to Machine Learning (5th ed.) https://www.engr.colostate.edu/~echong/book5/
Grade Evaluation: 4 Homework Assignments (10% each) + Midterm Exam (20%) + Final Exam (20%) + Notes & In-Class Participation (20%) [Updated Feb. 18, 2025]
Office: R428, Innovation Building (please make an appointment before coming)
Campus Internal Phone Number: 33528
Email: jjweng AT ccu.edu.tw
If you have any questions regarding the course or need a special accommodation, please email me from your school email account with:
Subject: [OPT 2025] Inquiry - Your name and Student ID number (example: [OPT 2025] Inquiry - 周杰倫 1234567)
Contents: (1) the topics you want to discuss and (2) your preferred times to meet in person (please specify at least 3 time slots)
I should reply to your email within 24 hours; if not, please send the email again.
Week 1 (02/18, 02/20): Sections 2-3
Introduction to Optimization
Mathematical Reviews
Linear Algebra: Vector Spaces, Matrices, Basis, Linear Transformations, Inner Product, Norms, Eigenvalues, Eigenvectors, Orthogonal Projections, Quadratic Forms
Week 2 (02/25, 02/27): Sections 3-5, Mathematical Reviews (Cont'd)
Concepts from Geometry: Line Segments, Hyperplanes and Linear Varieties, Neighborhoods, Polytopes and Polyhedra
Differential Calculus: Sequences and Limits
Week 3 (03/04, 03/06): Section 5, Mathematical Reviews (Cont'd)
Differential Calculus: Differentiability, Gradients, the Derivative Matrix (a.k.a. Jacobian Matrix), Differentiation Rules, Level Sets, and Taylor Series
Week 4 (03/11, 03/13): Section 6
Local Optimizers/Global Optimizers
Directional Derivative and Feasible Directions
Necessary and Sufficient Conditions for Optimality
First and Second Order Necessary Conditions
Second Order Sufficient Conditions for Strict Local Minimizers
Week 5 (03/18, 03/20): Sections 7, 8.1-8.2
One-dimensional Search Methods
Golden Section Search
Fibonacci Method
Bisection Method
Newton's Method
Secant Method
Bracketing
Gradient Descent Algorithm
Steepest Descent Method
Week 6 (03/25, 03/27): Sections 9.1, 9.3, 10
Newton's Method for Multivariate Functions
Levenberg-Marquardt Modification
Geometric Interpretation of Steepest Descent and Newton's Method
Introduction to Conjugate Direction Methods
Week 7 (04/01, 04/03): Section 10 (a video recording has been released on eCourse2)
Basic Conjugate Direction Algorithm
Conjugate Gradient Algorithm
Week 8 (04/08, 04/10): Section 11 (a video recording will be released on eCourse2)
Introduction to Quasi-Newton Methods
Approximating the Inverse Hessian
Rank One Correction Formula
DFP and BFGS Algorithms
Week 9 (04/15, 04/17): Section 15, Official Midterm Exam Week (4/17 Midterm!!!)
Introduction to Linear Programming
Week 10 (04/22, 04/24): Section 16
Basic and Feasible Solutions
Simplex Method
Week 11 (04/29, 05/01): Section 17
Matrix Form of Simplex Method
Two-Phase Simplex Method
Revised Simplex Method
Duality for Linear Programming
Week 12 (05/06, 05/08): Sections 17 and 20
Duality Theorems
Introduction to Problems with Equality Constraints
Tangent and Normal Spaces
Week 13 (05/13, 05/15): Section 21
Lagrange Condition
Second Order Conditions
Minimizing Quadratics subject to Linear Constraints
Week 14 (05/20, 05/22): Sections 22.1-22.3
Introduction to Problems with Inequality Constraints
Karush-Kuhn-Tucker (KKT) Condition
Second Order Conditions
Week 15 (05/27, 05/29): Sections 23.1-23.4
Introduction to Convex Optimization
Optimality Conditions
Week 16 (06/03, 06/05): Sections 23.4-23.5, Official Final Exam Week
Introduction to Lagrangian Duality
Primal-Dual Pair
General Duality Properties
Minimax Inequality Chain
Optimality of Saddle Point
Week 17 (06/10, 06/12): Section 24 & 6/12 Final Exam!!!
Weak Duality and Duality Gap
Strong Duality
Introduction to Algorithms for Constrained Optimization
Projected Gradient Methods with Linear Constraints
Lagrangian Algorithm
Penalty Methods
Week 18 (06/17, 06/19): No Class
You need to submit your answers on eCourse2, even though they are not graded. You are encouraged to work on the problems together with your friends; discussion is a good way to learn a new subject.
1.4, 1.7, 1.11
2.3, 2.9, 2.10, 2.11
3.1, 3.3, 3.6, 3.15, 3.18, 3.21
4.1, 4.8, 4.11, 4.12
5.4, 5.5, 5.7, 5.11 (read 5.8-5.9 and solve them if you like)
6.1, 6.8, 6.9, 6.11, 6.18, 6.20, 6.29 (read 6.3-6.4, 6.7, 6.33 and solve them if you like)
7.4, 7.8, 7.9
8.1, 8.8, 8.12, 8.15, 8.16, 8.18, 8.24
9.1, 9.4, 9.8
10.1, 10.2, 10.4, 10.6, 10.10
11.1, 11.3
15.1, 15.2, 15.3, 15.4, 15.10
16.2, 16.4, 16.6, 16.18, 16.19
17.1, 17.2, 17.3, 17.7, 17.12, 17.21, 17.24
20.1, 20.2, 20.3, 20.17, 20.18
21.1, 21.2, 21.6, 21.10, 21.12, 21.14, 21.21
22.1, 22.2, 22.8, 22.12, 22.18
S. Boyd and L. Vandenberghe, Convex Optimization. Available here
D. G. Luenberger, Optimization by Vector Space Methods. Available here
S. Bubeck, Convex Optimization: Algorithms and Complexity. Available: arXiv:1405.4980v2
Polytope and Polyhedron [Wiki]; also see this lecture note from Cornell Univ. and this one from Illinois.
In certain fields of mathematics, the terms "polytope" and "polyhedron" are used in a different sense: a polyhedron is the generic object in any dimension (referred to as a polytope in the linked Wikipedia article), and a polytope means a bounded polyhedron. This terminology is typically confined to polytopes and polyhedra that are convex. With this terminology, a convex polyhedron is the intersection of a finite number of halfspaces and is defined by its sides, while a convex polytope is the convex hull of a finite number of points and is defined by its vertices.
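To make the two descriptions concrete, here is the standard pair of formulations in LaTeX (a sketch in generic notation; the symbols A, b, and v_i are illustrative and not taken from the textbook):

\[
P = \{ x \in \mathbb{R}^n : Ax \le b \} \quad \text{(convex polyhedron: an intersection of finitely many halfspaces)}
\]
\[
Q = \operatorname{conv}\{v_1, \dots, v_k\} = \Bigl\{ \sum_{i=1}^{k} \lambda_i v_i : \lambda_i \ge 0,\ \sum_{i=1}^{k} \lambda_i = 1 \Bigr\} \quad \text{(convex polytope: the convex hull of finitely many points)}
\]

The Minkowski-Weyl theorem links the two views: a set is a convex polytope exactly when it is a bounded convex polyhedron, i.e., when it admits both descriptions.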
Change-of-Basis [Wiki]
Jonathan Richard Shewchuk, An Introduction to the Conjugate Gradient Method Without the Agonizing Pain Edition 1 1/4, 1994.
As we saw in class, for a symmetric positive definite matrix A, solving Ax = b is equivalent to minimizing the quadratic f(x) = (1/2)x^T A x - b^T x (the gradient of f vanishes exactly when Ax = b). Hence many iterative algorithms developed to solve linear systems can be transformed into procedures that find the minimizer of the quadratic function. The following materials cover fundamental results on how to solve Ax = b numerically, particularly the Krylov subspace methods. Such methods are highly related to the conjugate gradient (CG) methods in Section 10 of our textbook; a minimal CG sketch is given after the reading list below.
Henk A. van der Vorst, Iterative Krylov Methods for Large Linear Systems, Cambridge University Press, Cambridge, 2003.
R. Barrett, M. Berry, T. F. Chan, J. Demmel, J. Donato, J. Dongarra, V. Eijkhout, R. Pozo, C. Romine, and H. Van der Vorst, Templates for the Solution of Linear Systems: Building Blocks for Iterative Methods, 2nd Edition, SIAM, Philadelphia, PA, 1994.
T. Sogabe, Krylov Subspace Methods for Linear Systems: Principles of Algorithms, Springer, 2022.
Y. Saad, Iterative Methods for Sparse Linear Systems, 2nd Edition, SIAM, 2003.
W. Hackbusch, Iterative Solution of Large Sparse Systems of Equations, Springer, 2016.
A Brief Introduction to Krylov Space Methods for Solving Linear Systems from ETHZ
Gerard Sleijpen, A Course on Numerical Linear Algebra
Marc Bonnet, Lecture Notes in Numerical Linear Algebra
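As promised above, here is a minimal sketch of the conjugate gradient iteration for an SPD system (an illustrative Python implementation under my own naming and tolerance choices, not the textbook's pseudocode):

import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    # Solve Ax = b for symmetric positive definite A,
    # i.e., minimize f(x) = (1/2) x^T A x - b^T x.
    n = b.shape[0]
    x = np.zeros(n)
    r = b - A @ x                  # residual r = b - Ax = -grad f(x)
    d = r.copy()                   # first direction: steepest descent
    for _ in range(n):             # at most n steps in exact arithmetic
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        Ad = A @ d
        alpha = rr / (d @ Ad)      # exact line search along d
        x = x + alpha * d
        r = r - alpha * Ad
        beta = (r @ r) / rr        # keeps the new direction A-conjugate to d
        d = r + beta * d
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # a small SPD example
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(x, np.linalg.norm(A @ x - b))      # residual should be near zero

Note that each iteration needs only one matrix-vector product with A, which is the main reason Krylov-type methods scale to large sparse systems.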
G. Foschini, R. Gitlin and S. Weinstein, "Optimization of two-dimensional signal constellations in the presence of Gaussian noise," IEEE Trans. Commun., vol. 22, no. 1, pp. 28-38, Jan. 1974, doi: 10.1109/TCOM.1974.1092061.
M. Beko and R. Dinis, "Designing good multi-dimensional constellations," IEEE Wireless Commun. Lett., vol. 1, no. 3, pp. 221-224, Jun. 2012, doi: 10.1109/WCL.2012.032312.120203
J. Feldman, M. J. Wainwright and D. R. Karger, "Using Linear Programming to Decode Binary Linear Codes," IEEE Trans. Inf. Theory, vol. 51, no. 3, pp. 954-972, Mar. 2005, doi: 10.1109/TIT.2004.842696.
T. Cui, T. Ho and C. Tellambura, "Linear Programming Detection and Decoding for MIMO Systems," in Proc. IEEE ISIT, Seattle, WA, USA, 2006, pp. 1783-1787, doi: 10.1109/ISIT.2006.261741.
T. Wadayama and M. Hagiwara, "LP-Decodable Permutation Codes Based on Linearly Constrained Permutation Matrices," IEEE Trans. on Inf. Theory, vol. 58, no. 8, pp. 5454-5470, Aug. 2012, doi: 10.1109/TIT.2012.2196253.
陳俞閔
黨嘉鴻
姚宇翰
陳若語
黃少駿
蘇彥禎
楊崑稜
楊文棟
廖鈺馨
陳俊綜