
Unraveling SARS-CoV-2 Host-Response Heterogeneity through Longitudinal Molecular Subtyping

October 28, 2024

Transcript

  • 00:00 It's my pleasure to introduce our colleague Hui Cao from Applied Physics and Physics in the School of Engineering. Hui is the John Malone Professor of Applied Physics, and her research focuses on mesoscopic physics and complex nanophotonics. She received her PhD in applied physics from Stanford and was on the faculty at Northwestern before she joined Yale, lucky for us. She has been a pioneer and leader in work on random lasers, and recently she has been working on deep tissue imaging, which I think is what she is going to talk about. Dr. Cao is an elected member of the National Academy of Sciences and of the American Academy of Arts and Sciences. So, Hui?
  • 00:54 Let me see if I can find it. Okay. First, it's a great pleasure for me to give this talk, and I'd like to thank John for the invitation.
  • 01:03 I'm trained as a physicist, and I work in applied physics and engineering. I know nothing about the immune system, but I would love to see whether I can collaborate with people here at your medical school. I'm going to start with an introduction to my previous work and then show what we are trying to do right now. Hopefully this will give you some idea of what I'm doing, and I'm very much looking forward to talking with you about potential collaborations.
  • 01:33 Okay, so this is my title. But before that, as I said, I want to talk a little bit about what I've been doing, especially the kind of work where I collaborate with people at your medical school.
  • 01:45 I work in optics, and for optical imaging what is really important is the light source for illumination. For wide-field imaging with a microscope, most people use a lamp or an LED, but those have limited intensity. If you want to get into tissue, or into some material that has absorption, that may not be enough intensity.
  • 02:15 Of course, people say: how about lasers? A laser, or a superluminescent LED, is much brighter, so you get a much stronger signal and much stronger excitation. But I want to show that there is another important property in addition to brightness: the coherence.
  • 02:37 A laser is coherent, whereas an LED or a lamp is incoherent. Here the coherence actually hurts us, because it introduces coherent artifacts, in particular speckle noise, which I'm going to show you; maybe some of you have already seen it. The ideal light source would be as bright as a standard laser but with low coherence, so that we don't have all these coherent artifacts when we use it for illumination.
  • 03:10 We have been studying how to find a light source that combines the advantages of a laser and a lamp. That is how I got into the random laser, which can really produce a speckle-free image.
  • 03:26 Here is an example. You see this speckled image: we are imaging a sample with a scattering layer between the sample and the objective lens and camera, using a laser for illumination. We see only the speckle pattern and nothing else. That is a strong coherent artifact coming from the interference of the laser light.
  • 03:55 We developed a random laser, and using it as the illumination, even with significant scattering we can still see these features. This kind of random laser can be as bright as a conventional laser, so it gives us a more powerful source for speckle-free imaging, especially full-field imaging.
  • 04:20 That was the first part: getting rid of speckle. But it turns out speckle is not always bad. Even though it does not let us see the structure, it can let us see motion. If something is moving while it generates the speckle pattern, the pattern changes; if you integrate over a certain time, the speckle averages out and you get low contrast. So the speckle contrast tells us about motion.
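For illustration only (not from the talk), a minimal sketch of how speckle contrast, defined as the standard deviation of intensity divided by its mean, could be computed with NumPy: a static, fully developed speckle pattern has contrast near 1, while speckle integrated over a moving scatterer's decorrelation time has lower contrast.

```python
import numpy as np

def local_speckle_contrast(image, window=7):
    """Speckle contrast K = std(I) / mean(I) in sliding windows.

    Low K in a region suggests the speckle decorrelated (i.e., something
    moved) during the exposure; K near 1 suggests a static scatterer.
    """
    pad = window // 2
    padded = np.pad(image.astype(float), pad, mode="reflect")
    K = np.zeros(image.shape, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            patch = padded[i:i + window, j:j + window]
            m = patch.mean()
            K[i, j] = patch.std() / m if m > 0 else 0.0
    return K

rng = np.random.default_rng(0)
# Fully developed static speckle: intensity is exponentially distributed, K ~ 1.
static = rng.exponential(1.0, (64, 64))
# Time-integrated speckle from a moving sample: average of decorrelated frames, K ~ 1/sqrt(N).
moving = rng.exponential(1.0, (20, 64, 64)).mean(axis=0)
print(local_speckle_contrast(static).mean(), local_speckle_contrast(moving).mean())
```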
  • 04:47 What would be ideal is a laser source whose coherence we can switch. First I go to low spatial coherence and see the structure; then I switch to high spatial coherence, see the speckle and how it is changing, and from that I can see the flow.
  • 05:03 Here is one example, from a collaboration with Professor Michael Choma and Mustafa Khokha here at the medical school. We imaged the beating heart of a living tadpole. We want to see what happens to the structure during the heartbeat, and we also want to see the flow, because this is an animal model of heart disease.
  • 05:32 This shows that we developed a laser whose spatial coherence we can switch back and forth very quickly.
  • 05:38 So if I can... why can't I play the... can somebody help me play the video? Okay, anyway, this video is not working. Let me see what I can do here. Somehow it cannot play. Okay, anyway, that's fine, no problem.
  • 06:32 Maybe when we converted this file there was some issue. Anyway, you are supposed to see the heart beating and, at the same time, the flow of blood from one chamber to another. This allows us to see the structural change, the shape change, and the flow simultaneously. That basically lets them study, when there is disease, what happens to the deformation and what happens to the blood flow, because the contrast, shown in yellow here, tells you how much blood there is at each location.
  • 07:07 You could say, well, maybe I can also do this with ultrasound; why do it with light? Because light gives us much better spatial resolution, so for some applications this could be useful.
  • 07:20 Alright, so in the first part we developed a light source whose coherence we can tune for different applications. I have to say I really enjoyed the collaboration with Michael and Mustafa, because they tell me what they need, what kind of light source they need, and that helps me develop this kind of light source for the application. This is very important for engineers, because we have tools but often don't know what they can be useful for.
  • 07:45 That was the first part. The second part is that we also do computational imaging, meaning structured illumination: we illuminate the sample with particular patterns that can, for example, give us better resolution or additional features that we cannot get with homogeneous illumination.
  • 08:07 For this we use speckle patterns, because compared with the periodic structures people usually use for structured illumination, speckle has some additional advantages, for example axial sectioning. I will not get into the details, but what we did is design the speckle pattern rather than just use a common speckle pattern; we can design it for a particular application and use it to enhance the imaging performance.
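As a generic illustration of the kind of illumination being discussed (a standard textbook construction, not the group's designed patterns), a random speckle pattern can be simulated by assigning random phases over a limited pupil in the Fourier plane and propagating to the image plane; designing a pattern would amount to optimizing those phases instead of drawing them at random.

```python
import numpy as np

def random_speckle(n=256, pupil_fraction=0.25, seed=0):
    """Simulate a fully developed speckle intensity pattern.

    A uniformly random phase is placed on each point of a circular pupil
    (limited spatial-frequency support); the illumination intensity is the
    squared magnitude of the Fourier transform of that field. A *designed*
    pattern would instead optimize these phases for a specific imaging task.
    """
    rng = np.random.default_rng(seed)
    y, x = np.indices((n, n)) - n // 2
    pupil = (x**2 + y**2) < (pupil_fraction * n / 2) ** 2
    field = pupil * np.exp(2j * np.pi * rng.random((n, n)))
    intensity = np.abs(np.fft.fft2(np.fft.ifftshift(field))) ** 2
    return intensity / intensity.mean()

pattern = random_speckle()
print("mean:", pattern.mean(), "contrast:", pattern.std() / pattern.mean())  # contrast ~ 1
```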
  • 08:40 Here, again, we collaborated with Professor Joerg Bewersdorf at the medical school, using these designed speckle patterns for parallelized nonlinear patterned illumination microscopy, which is based on fluorescence photoswitching.
  • 08:57 Let me show a simple example. Outside this region is an illustration with the standard speckle pattern people use, and in the central box is a particular speckle pattern we designed. When we use this to illuminate the protein, in the bright regions the protein switches, while in the dark regions it does not switch, so it can still fluoresce when we excite it. Now look at the fluorescence: it is very different. Outside, with standard speckle illumination, you see elongated, network-like fluorescence from the sample, and it is hard to tell what is going on. Inside, the fluorescence comes from isolated individual bright regions. Those come from the dark regions we designed into the speckle pattern, and this gives us better resolution.
  • 10:01 We showed that we can beat the optical diffraction limit by a factor of three, which gives us three times better spatial resolution.
  • 10:11 We are continuing to design speckle patterns for particular applications, optimizing them so they can be combined with the reconstruction algorithms people have developed, so that we simultaneously optimize the hardware, which is the illumination, and the software, which is the algorithm that reconstructs the information.
  • 10:34 Again, we would like to see what the potential medical applications could be.
  • 10:40 The next topic we have been studying is deep tissue imaging. As John just mentioned, we recently received a phase-two grant from the Chan Zuckerberg Initiative on deep tissue imaging. Our goal is to utilize correlation engineering to achieve deep multiphoton microscopy.
  • 11:05 Our approach is, again, on the light source, because the light source is really the engine of optical microscopy. If you look at the last twenty or thirty years, the breakthroughs in optical imaging have come from the light source: ultrashort pulses and other things. So we want to build the next generation of laser source for better, deeper, and gentler microscopy.
  • 11:33 For this we have a team: collaborations with Professor Tianyu Wang from Boston University and with a professor from Cornell University who is a world leader not just in two-photon but in three-photon microscopy. We are also fortunate to have Logan Wright, a young assistant professor who joined the Yale Applied Physics department, working together to develop this kind of light source.
  • 12:04 What we particularly want to do, without getting into much detail, is to precisely tailor the coherent multimode laser light so that we can maximize the multiphoton absorption efficiency in complex fluorescent molecules. Basically, we choose which fluorescent molecules we want to target and then see how we can maximize the multiphoton absorption by tailoring the illumination source.
  • 12:33 In some way, we want to co-design the light source together with the molecule so that we can enhance three-photon microscopy, which is mostly for deep tissue imaging, such as brain imaging.
  • 12:46 That is an overview of what I have been doing so far. Now I want to turn to wavefront shaping, another technique I wish to introduce to you, to see whether it could be applied to the immune system.
  • 13:02 Why can't we usually see deep into tissue? Because of light scattering, not absorption. You can see this on a foggy day: something close by is easy to see, because that light comes from a single scattering, a single reflection. If you go a little deeper, you can still sort of see something, but it is not so easy.
  • 13:34 That kind of signal can be reconstructed or extracted with optical coherence tomography, multifocal microscopy, or confocal microscopy; you can do that in tissue. But if you go even deeper, it becomes mission impossible, because you just cannot see anything. That is really what we want: to go much deeper and see what is going on there. We want to push this frontier.
  • 14:06 For imaging into biological tissue there are two regimes. Of course, if there is nothing in the way, that is perfect: a lens images it perfectly. For example, if you could make the brain transparent, you could just do that. With slight aberrations or a little scattering, you see a distorted image, but you can still guess something from it. What we are really interested in is going much deeper, where all the scattering gives you a speckle pattern and you cannot see anything anymore. But we still want to see something, so how can we do that?
  • 14:44 To really see something deep inside, the first step is to get the light deep in; then we can probe something.
  • 14:50 The first task is whether we can focus light through a strongly scattering medium. As I said, light going through a scattering medium becomes speckle; you don't see anything. But I still want to focus the light to a spot behind the strongly scattering medium. How can I do that?
  • 15:12 We can use so-called wavefront shaping: we precompensate for the effect of scattering. In other words, I take a laser beam and send it to a spatial light modulator. If I modulate the wavefront in a smart way, then when the light is scattered along different paths into a particular position, those paths interfere constructively and greatly enhance the intensity at that position. In this way you can really get light deeper in, and people have already shown that.
  • 15:46 Originally this enhancement reached about a thousand times; now people can achieve intensity enhancements of a hundred thousand times, so you can really enhance the intensity at a chosen location.
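A minimal numerical sketch of this idea (an illustration under simplified assumptions, not the actual experiments): model the scattering medium as a random transmission matrix and apply phase-only conjugation of one of its rows at the input, so the scattered contributions add in phase at one output spot; the expected enhancement over the background speckle is roughly pi/4 times the number of controlled input channels.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 512, 512

# A random complex Gaussian matrix is a common toy model of a strongly
# scattering medium: output field = T @ input field.
T = (rng.normal(size=(n_out, n_in)) + 1j * rng.normal(size=(n_out, n_in))) / np.sqrt(2 * n_in)

target = 0  # output position where we want the focus

# Unshaped illumination: a flat wavefront with unit total power.
E_flat = np.ones(n_in) / np.sqrt(n_in)
I_flat = np.abs(T @ E_flat) ** 2

# Wavefront shaping: phase-only conjugation of the row that maps onto the
# target, so every scattered path arrives there in phase.
E_shaped = np.exp(-1j * np.angle(T[target])) / np.sqrt(n_in)
I_shaped = np.abs(T @ E_shaped) ** 2

# Enhancement of the focus relative to the mean background speckle;
# for phase-only control it is expected to be about (pi/4) * n_in.
enhancement = I_shaped[target] / I_flat.mean()
print(f"enhancement ~ {enhancement:.0f}  (theory ~ {np.pi / 4 * n_in:.0f})")
```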
  • 16:00 Now, if you have a way to get the light in there to excite some molecules, then to do imaging you need to scan that focus position; if you can scan it, you can get the image. People have tried to do this using the memory effect, which I don't have time to get into. But I want to show one example, not from my group, where people showed that you can do optogenetic control of a cell signaling pathway through the highly scattering mouse skull.
  • 16:35 A skull is highly scattering: if you just send light through it with an objective lens, you don't get anything; it is diffused away. Of course, you could use an optical fiber, which is still a really important technology, but that is somewhat invasive. The question is how to do it noninvasively, and people have shown that by shaping the wavefront, we can focus deep inside, through the scattering skull, and then by scanning that spot, get an image there.
  • 17:13 But then you ask: how do I know where I am focusing inside? I haven't even drilled a hole, so how can I see what is inside? To do that, we need a guide star. A guide star can be a fluorescent marker, a nonlinear optical particle, photoacoustic feedback, ultrasound, or a kinetic guide star. Many different types of guide stars have been developed over the last decade to help focus light deep into the system. There has been a lot of work there, which I don't have time to get into, but basically wavefront shaping may allow us to see somewhat deeper.
  • 17:56 Then you say, well, fine, you can probably go deeper, but you still need to put something into the system, like a guide star, to help you. What about being completely noninvasive? For that there is another technology called diffuse optical tomography.
  • 18:14 It has been widely used in brain imaging. An array of laser diodes or LEDs is placed on the skull, interleaved with photodetectors. Light is injected from a source, usually near-infrared light, which has less absorption and less scattering, so it can get through the skull and into the brain; after all the scattering inside, a small component is re-emitted at some distance from the original injection location.
  • 19:00 Usually, that re-emitted light carries information from deep inside the brain. If you further increase the distance between the source and the detector, the light has to go even deeper and carries information from an even deeper region, which is better.
  • 19:19 But what is the difficulty? The signal is very weak. As you increase the distance between the source and the detector, the signal scales as one over the distance squared, so if you double the distance, your intensity is four times weaker.
  • 19:38 So what is our approach? Our approach is to shape the input wavefront so that we enhance the light injection into the brain, or any diffusive medium, and thereby enhance the re-emitted signal. We show here that the enhancement of the re-emission signal can approach about ten times, and meanwhile the sensitivity of the re-emitted signal to internal absorption changes can also be increased ten times.
  • 20:12 My time is getting close, so let me finish with one last thing and then I'll be done. Of course, I said we want to be noninvasive, but noninvasive methods can only take you so far. Eventually, if you really want to go very deep, you want to go through a fiber, using an endoscope; you want to dip in there. The best choice is a multimode fiber, because it has many more spatial channels and carries more information. But a multimode fiber also gives you a speckle pattern because of the random mode mixing.
  • 20:42 Again, we can use wavefront shaping to focus light at the distal end of the fiber using the same technique, a spatial light modulator. In fact, using the spatial light modulator we can control not only the spatial pattern at the distal end of the fiber, but also the temporal pulse shape and the polarization state, and all of these can be used to probe the molecules or cells at the other end of the multimode fiber. With that, I'll stop here, and thank you for your attention.
  • 21:22Thank you.
  • 21:23 We actually have time for some questions.
  • 21:37 Thank you. I was wondering, in the deep tissue imaging, how is the correction for the incoming wavefront calculated? I know you don't want to get into too much detail, but how would you know how to apply that correction?
  • 21:50 The question is how we know what wavefront to send in. That's what I said: we need a guide star. A guide star can be a fluorescent marker, a nonlinear particle, or even something like bubble bursting that gives you an acoustic signal. If you have something in there, it tells you what the input wavefront should be. Indeed, you need something inside.
  • 22:11 On that note, going back to the machine learning aspect: could you generate reasonably random inputs and actually learn how they map to various outputs, in a control system?
  • 22:23 Yeah, I think that's a very good question: whether we can use machine learning to learn what wavefront we need. Yes, people have been trying that. If you have enough data, you can use it to find the new wavefront for focusing afterwards. But the thing is that biological tissue is usually moving, so you need to learn this very fast; that is typically the challenge. Right, so good feedback is important. Good feedback is important, and a quick one. Yep.
  • 22:51 Okay. Great. Thanks, Hui.