<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Ang Chen's Lab]]></title><description><![CDATA[An optical physicist. I blog about semiconductors, optics, photonics, colors and more topics.]]></description><link>https://angchenlab.com</link><generator>GatsbyJS</generator><lastBuildDate>Tue, 07 Apr 2026 00:11:20 GMT</lastBuildDate><item><title><![CDATA[Colors: Structure, Perception, and Design]]></title><description><![CDATA[How I think about color across physics, materials, and human perception.]]></description><link>https://angchenlab.com/blog/colors/</link><guid isPermaLink="false">https://angchenlab.com/blog/colors/</guid><pubDate>Mon, 09 Mar 2026 12:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Color is one of the easiest topics to make look simple and one of the hardest to treat carefully. A wavelength spectrum exists in the world, but “color” only emerges after materials filter light and a visual system interprets the result.&lt;/p&gt;
&lt;p&gt;That is why I like writing about colors as a bridge topic. It links optics, photonics, materials, and perception without pretending they are the same thing.&lt;/p&gt;
&lt;p&gt;Some useful distinctions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Pigment color comes from absorption and scattering.&lt;/li&gt;
&lt;li&gt;Structural color comes from geometry and interference.&lt;/li&gt;
&lt;li&gt;Perceived color depends on illumination and context.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;I am especially interested in the cases where those layers do not align cleanly. A sample may have a well-defined reflectance curve, yet still look different under another source or against another background. That gap between measurement and perception is where the topic becomes interesting.&lt;/p&gt;
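&lt;p&gt;A toy numerical sketch of that gap, with made-up band values rather than real colorimetric data: weight one fixed reflectance curve by two different illuminants and compare the resulting stimuli.&lt;/p&gt;

```python
# Toy sketch (hypothetical numbers): the same reflectance curve produces
# different stimuli under different light sources. Three coarse bands
# stand in for a full spectrum: blue, green, red.

reflectance = [0.2, 0.7, 0.4]   # fraction of light the sample reflects per band
daylight = [1.0, 1.0, 1.0]      # idealized flat illuminant
warm_lamp = [0.4, 0.8, 1.2]     # illuminant skewed toward long wavelengths

def stimulus(illuminant, sample):
    """Band-wise product of illuminant power and sample reflectance."""
    return [round(e * r, 3) for e, r in zip(illuminant, sample)]

print(stimulus(daylight, reflectance))   # [0.2, 0.7, 0.4]
print(stimulus(warm_lamp, reflectance))  # [0.08, 0.56, 0.48]
```

&lt;p&gt;The measured reflectance never changed; only the illuminant did, yet the light reaching the eye differs in each case.&lt;/p&gt;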
&lt;p&gt;That is also why “colors” is a better label here than “coloration”. It reads more naturally, and it keeps the emphasis on the phenomenon itself rather than on a narrower process description.&lt;/p&gt;
&lt;p&gt;When I use this tag on the site, I want it to cover both the physical origin of color and the design choices that shape how people finally see it.&lt;/p&gt;</content:encoded></item><item><title><![CDATA[Optics: Imaging, Refraction, and Systems]]></title><description><![CDATA[A compact overview of how I think about optics as a system-level discipline.]]></description><link>https://angchenlab.com/blog/optics_imaging_refraction_systems/</link><guid isPermaLink="false">https://angchenlab.com/blog/optics_imaging_refraction_systems/</guid><pubDate>Sun, 24 Aug 2025 12:00:00 GMT</pubDate><content:encoded>&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;2026 Update&lt;/strong&gt;: this is the start of my personal website!&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Optics is where I usually start when I want to reason about a light-based system at human scale. If photonics is the broad field, optics is often the operational layer: lenses, imaging paths, apertures, aberrations, and the tradeoffs that show up once a real device has to work.&lt;/p&gt;
&lt;p&gt;Three questions tend to keep the discussion honest:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;What field distribution am I sending into the system?&lt;/li&gt;
&lt;li&gt;How does each surface or medium transform it?&lt;/li&gt;
&lt;li&gt;What loss of information am I willing to accept?&lt;/li&gt;
&lt;/ol&gt;
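&lt;p&gt;One minimal way to make question 2 concrete is the paraxial ray-transfer (ABCD) formalism: each surface or stretch of free space becomes a 2x2 matrix acting on a ray's height and angle. A sketch with illustrative numbers, not any particular system:&lt;/p&gt;

```python
# Paraxial (ABCD) ray-transfer sketch: each element maps a ray (height y, angle u).
# Thin lens of focal length f, free-space propagation over distance d.

def propagate(d):
    return [[1.0, d], [0.0, 1.0]]

def thin_lens(f):
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def apply(m, ray):
    y, u = ray
    return [m[0][0] * y + m[0][1] * u, m[1][0] * y + m[1][1] * u]

ray = [1.0, 0.0]                      # collimated ray entering at height 1
ray = apply(thin_lens(f=50.0), ray)   # lens bends the ray toward the axis
ray = apply(propagate(d=50.0), ray)   # travel one focal length
print(ray)                            # crosses the axis at the focal plane: [0.0, -0.02]
```

&lt;p&gt;Composing the matrices for a whole element train answers the transformation question in the geometric limit; wave effects matter only where that picture breaks down.&lt;/p&gt;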
&lt;p&gt;That framing works for simple refractive setups and for more exotic computational systems too. It forces attention onto resolution, contrast, throughput, and robustness instead of treating “good optics” as a vague aesthetic judgment.&lt;/p&gt;
&lt;p&gt;I like optics because it rewards clean approximations. Geometric optics can get you surprisingly far. Wave optics tells you when the approximation breaks. The interesting engineering happens in the handoff between those two views.&lt;/p&gt;
&lt;p&gt;When I write about optics here, I want those posts to stay close to physical intuition: what bends, what focuses, what blurs, and why the whole system behaves the way it does.&lt;/p&gt;</content:encoded></item><item><title><![CDATA[Photonics: Designing Light with Structure]]></title><description><![CDATA[A short note on how photonics uses structure and materials to control light.]]></description><link>https://angchenlab.com/blog/photonics_designing_light_with_structure/</link><guid isPermaLink="false">https://angchenlab.com/blog/photonics_designing_light_with_structure/</guid><pubDate>Thu, 10 Oct 2024 12:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Photonics is the broad toolbox for guiding, confining, and transforming light. The interesting part is that a device does not need to be “complicated” in the usual mechanical sense to be powerful: often a careful arrangement of geometry, refractive index, and scale is enough.&lt;/p&gt;
&lt;p&gt;The reason I keep coming back to photonics is that it sits at a useful intersection of physics and design:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Maxwell’s equations give the underlying rules.&lt;/li&gt;
&lt;li&gt;Fabrication constraints tell us what is actually buildable.&lt;/li&gt;
&lt;li&gt;Performance targets force us to decide what matters most.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;That combination makes photonics a good home for both intuition and optimization. A waveguide, metasurface, or resonator is never just a shape on a screen; it is a compact argument about how light should behave.&lt;/p&gt;
&lt;p&gt;For me, the appeal is practical as well. Photonics gives you a language for talking about sensing, imaging, communication, filtering, and color in one connected framework. Once that clicks, many seemingly separate problems start looking like variations of the same design question.&lt;/p&gt;
&lt;p&gt;Future posts in this tag will stay focused on that theme: how light responds to structure, and how structure can be chosen with intent.&lt;/p&gt;</content:encoded></item><item><title><![CDATA[More on Optics]]></title><description><![CDATA[More on optics.]]></description><link>https://angchenlab.com/blog/optics_more/</link><guid isPermaLink="false">https://angchenlab.com/blog/optics_more/</guid><pubDate>Wed, 24 Aug 2022 12:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Test optics!&lt;/p&gt;</content:encoded></item><item><title><![CDATA[Inverse Design: Let the Target Drive the Geometry]]></title><description><![CDATA[A brief note on solving for the structure that produces a desired optical response.]]></description><link>https://angchenlab.com/blog/inverse-design/</link><guid isPermaLink="false">https://angchenlab.com/blog/inverse-design/</guid><pubDate>Sun, 15 Mar 2020 12:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Inverse design flips a familiar workflow. Instead of starting with a geometry and asking what it does, you start with a target behavior and ask what geometry could produce it.&lt;/p&gt;
&lt;p&gt;That shift sounds cosmetic, but it changes the whole posture of the problem. Once the desired transmission, reflection, focal profile, or spectral response is explicit, the design task becomes a search over structures under physical and fabrication constraints.&lt;/p&gt;
&lt;p&gt;I like inverse design because it makes tradeoffs visible early. You cannot ask for arbitrary performance without paying for complexity, bandwidth, tolerance, or manufacturability. The optimization process forces those conflicts into the open.&lt;/p&gt;
&lt;p&gt;In practice, the workflow usually looks something like this:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Specify the field or spectrum you want.&lt;/li&gt;
&lt;li&gt;Choose a parameterization for the device.&lt;/li&gt;
&lt;li&gt;Evaluate the forward physics model.&lt;/li&gt;
&lt;li&gt;Update the design until the objective and constraints are both acceptable.&lt;/li&gt;
&lt;/ol&gt;
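&lt;p&gt;The four steps above can be sketched end to end with a deliberately fake forward model, a one-parameter “device”, and the crudest possible update rule (a parameter scan). Every name and number here is illustrative, not a real solver:&lt;/p&gt;

```python
# Toy inverse-design loop (surrogate forward model, not real electromagnetics):
# search a 1-D "thickness" parameter so a spectral response matches a target.
import math

def forward(thickness, wavelength):
    # Step 3 stand-in for a physics solver: a smooth interference-like response.
    return 0.5 * (1.0 + math.cos(4.0 * math.pi * thickness / wavelength))

# Step 1: desired response at a few wavelengths (values made up).
target = {450.0: 0.9, 550.0: 0.2, 650.0: 0.7}

def objective(thickness):
    return sum((forward(thickness, w) - t) ** 2 for w, t in target.items())

# Steps 2 and 4: parameterize as a single thickness, then scan it;
# a gradient-based update would replace this loop in practice.
candidates = [100.0 + 0.5 * i for i in range(800)]
best = min((objective(t), t) for t in candidates)
print(best)  # (residual objective value, best thickness)
```

&lt;p&gt;Swapping the scan for a gradient-based update, and the surrogate for a real electromagnetic solver, is what turns this skeleton into the adjoint-style workflows used in practice.&lt;/p&gt;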
&lt;p&gt;For this site, I want the inverse design tag to stay centered on that way of thinking: define the outcome first, then work backward toward a structure that earns it.&lt;/p&gt;</content:encoded></item></channel></rss>