"I have found it!"
In a captivating scene, Archimedes swiftly emerges from his tub, water droplets still clinging to him, and dashes through the ancient streets of Syracuse. Propelled by the sheer weight of his revelation, he finds the authenticity of the golden crown inconsequential compared with the triumph of timely knowledge. It is a breakthrough of such magnitude that he cannot keep it to himself a moment longer. And, of course, a more precise explanation for the intricacies of fluid displacement is ample justification to dismiss any ridicule directed at his vivacity in that moment.
Fact or tale, this story of the exuberant mathematician has left an enduring impression on me since childhood. In those days, when I wholeheartedly embraced the belief that "if you can think it, you can be it," I was captivated by the idea that I, too, could alter the course of history, much like the inspiring figure in this remarkable story.
Fun times.
My husband and I occasionally share a pastime—recounting inventions from centuries ago that are still profoundly relevant today. This usually ends with him wondering if there are any new inventions in our generation, while I secretly rack my brain to recall the freshest, most ingenious discovery I've stumbled upon.
In the present era, I figure it is a tad more difficult for scientists to make entirely novel discoveries in science and technology. Many breakthroughs in recent times have depended upon previous explorations in their respective fields of endeavour. The development of the modern computer, for example, has evolved from the days of Charles Babbage and Ada Lovelace to the emergence of artificial intelligence (AI) and quantum computing.
In healthcare, round after round of research has generated itchy spots begging for answers, uncovering fresh perspectives, subtleties, and limitations, in turn necessitating further research. Yet, no matter how innovative, much of it is recycled gist, at best oiling an already spinning wheel. In my budding scholarly ambition, I've frequently delved into studies that build upon previous research. However, I've noticed that the new themes proffered often represent a single nuance or revolve around matters of semantics. It bugs me because I don't typically enjoy soft, subtle changes.
Now my curious frustration.
Invention, innovation, or neither, it occurs to me that humans, myself included, possess an insatiable need to share information, gossip, and newly acquired knowledge. We tend to add an extra flair to sweeten the recycled gist to give ourselves an extra appeal, so much so that it is now okay to embellish and even falsify information. Call it memory distortion, confirmation bias, or just plain attention seeking, social psychology presents it better.
What I'm trying to express is that, despite feeling easily overwhelmed by public interactions, there is certainly a part of me that yearns to share a piece of myself with the world. Simultaneously, there's a recognition that external validation, even from strangers, isn't entirely unwelcome. Right? Take, for instance, the inception of this blog, initially created as a personal repository of my reflections, indifferent to whether anyone would be interested in them. My plan was to allow the tide of coincidence to carry my entries to anyone who happened to stumble upon them. However, as fate would have it, I began actively sharing the link across my social media platforms and enabling comments from the very first post. The same pattern emerged years ago with some of the poems I've shared on Instagram. Despite any pretence, the likes and comments undeniably contribute to a genuine sense of satisfaction and self-worth.
I thoroughly enjoy immersing myself in the gazillion possibilities on the internet and in books, and one of my chief hobbies is reading about people. In our sweet digital age, with each blue or airglow sky, a new story or piece of information online unfolds. It's truly fantastic. I guiltily imagine my life in the spotlight, swooning in the ocean of a million validations. Nah. That's a joke.
Fortunately, we no longer have to run through the streets of Syracuse to announce our 'eurekas'. While books may be gradually phasing out, we've never run out of ways to declare our tipping points. Social media has significantly lowered the communication threshold and brought us closer than ever before. Yet the price we pay for this convenience is a gift that keeps on giving.
New idea. Fresh story. Innovative tactic. Novel perspective. The good. The irksome. Glory, glory! I love new things! Unfortunately for me, the manner in which this 'new knowledge' is disseminated these days bothers me.
I find it profoundly unsettling when individuals adopt a smug approach in conveying messages to an uninformed or under-informed audience. The arrogance is unwarranted and inexcusable, I suppose. It becomes even more infuriating when this flawed communicative tone is perpetuated by individuals who are still on the growth curve, yet to find their footing in their respective fields. Isn't it concerning that someone who has recently learnt a new technique, perhaps from watching a vlog, attending a masterclass, or reading a book, can be so condescending simply because they are one or two steps ahead of others?
Let’s consider it a newfound conviction, a shift in mindset, or a fresh perspective on an issue. Why must we resort to snide or sarcastic remarks when sharing our discoveries, as if implying we were born with the knowledge? I believe the foundation, permutations, and the like, behind convictions require time to fully accumulate and consolidate. So, why is it challenging to extend the same grace to others, allowing them to attain our level of enlightenment, a commodity we freely afforded ourselves while learning about those new concepts?
Health education.
On health education, I have often wondered: Is there a way to convey important health messages on social media, particularly on X and Instagram, without inadvertently resorting to snide remarks, reverse psychology, and sarcasm? Humour is great, but being an expert (sometimes not) on a subject matter doesn't justify ridiculing others for their limited knowledge. With over six years spent in medical school, it's unrealistic to expect non-doctors or my patients to possess the same depth of knowledge. At the same time, I shouldn't underestimate them or dismiss their sensibilities when attempting to impart valuable information.
Furthermore, what I know is merely a fraction compared to the vast knowledge I'm yet to acquire, even within my field. If I can't offer a service or provide informal consultation, I should be able to respectfully articulate my reasons and guide people to the appropriate resources for assistance.
What puzzles me is the feigned shock or genuine horror on people's faces when "you guys" (their audience) don't know what they know.
Same story everywhere.
About five or six years ago, an old acquaintance of mine became obsessed with cryptocurrency. He boldly declared on social media that he was determined to sever ties with friends who refused to embrace buying bitcoin, believing that anyone not on board with the trend would undermine his identity as an intelligent person. He was convinced that cryptocurrency was the future, destined to replace fiat currency in an impending apocalypse. Sadly, he stayed true to his word, and what could have blossomed into a meaningful friendship deteriorated, all because of a trend he was fortunate enough to have caught onto before others.
Part of what I’m saying is that we shouldn’t have to vilify ignorance or make everyone else uncomfortable while conveying a message. Education or information should be a solution to ignorance, and recognising that others may genuinely face barriers in attaining our 'wealth' of knowledge is another solution. Being insensitive to this is another kind of ignorance. The fact that we have stumbled upon new information doesn’t mean others haven’t, nor does it pause their journey towards it. Since we need to share our knowledge and experiences, it’s feasible to tone down the haughtiness and presumption and simply dispatch the memo. After all, there's only so much information humans can swallow at a time.
Is it graspable that a solitary, monochromatic, and asymmetrical shape commands a value running in the millions?
Once, I found myself gazing at multiple lines of what appeared to be unintelligible scribbling, perplexed as to how it had gotten past a first glance, let alone evoking mushiness from the enthusiasts.
I get it, it's cute, but why does it merit such a hefty price tag, costing not only my arms and legs but also those of my neighbours?
...5673 more ramblings...
Some modest clarity, eventually:
…well, plus a bit of ongoing education in the underlying microeconomics of auctions.
Little addendum.
About a year ago, I attended an immersive art experience featuring Vincent van Gogh. After a near-baptismal experience, it occurred to me why some of his pieces were auctioned for a couple of million dollars. It sounded ludicrous at first; however, I came to understand that beyond the appreciation of his talent, his works are profoundly stirring and communicative. It taught me one thing: art bears (and bares) a soul. One breathed into it by its gifted creator. It did not come to me naturally, but with open-mindedness, I could see it too. The "face card" might not be striking, nor the beauty stark, but by lingering a little longer, I think I can muster the emotional resonance that comes with the appreciation of art, even art that's not my "spec" (if I have any).
Pretty much applicable to humans, I believe.
There'd probably be no need to label anyone as beautiful or unremarkable if we all had the same appeal. But thanks to the innumerable self-love activists and the propagation of the agenda that everyone is bodily beautiful, people these days seem to emphasise their satisfaction with exactly how they look. Unfortunately, no matter how much I'd like to jump on the bandwagon, not everyone is visibly endearing at first, or even bold enough to corroborate the claim, for that matter. It's a natural bias or flaw, as the case may be.
Obvious beauty, no matter how charming, is unlikely to keep anyone connected for the long haul. Even beauty pageants and supermodels have to prove themselves beyond their good looks. Some people may look rough on the outside, and may never come to have physical allure, but that doesn't mean they aren't bright gems.
Whenever I catch myself gunning for beauty or 'posh' in people, I recall that I never envisioned myself as physically appealing, or at least, I didn't consider myself a catch until I turned fifteen. By a stroke of luck, the handsome boy I liked—whom I believed wouldn’t notice someone as smallish as me—liked me back. However, even my blemished perception of myself struggled against the sheer shallowness of that superficial charm. I grew too scared to stick around until 'we' cultivated any depth, so I quickly ran away. I don’t claim to have as much substance as I’m yearning for, but I’ve made a few good choices in my life, my husband being one of them.
This realisation has redefined my appreciation of beauty: substance over mere aesthetics. While I value aesthetics, I hope my appreciation delves much deeper, into the more valuable realms of meaning. This reflection brings to mind the analogy of books: the covers may be visually pleasing, yet they often wear and tear long before the pages that hold the substance of the stories. They're hardly more important than the pages they house.
Continuously, I strive to train my eyes to discern the intrinsic sweetness in people—a quality they have more control over than their looks. Simultaneously, I extend the hope that others make a conscious effort to do the same when looking at me.
I'm grateful for the subjectivity of beauty; nonetheless, I'd like to reach further, hoping to connect with the good in others that may not be immediately obvious.
Oh, and knowing when to stop looking.
The Earth might have taken a detour, and maybe it now takes a leisurely 12 hours to complete a day's spin on its axis...
Well, I recently discovered that we're currently in 2024, yet I can't seem to comprehend that we actually spent all 365 days of 2023. Within the sanctuary of my imagination, the Earth is spinning out of control, and it's daunting to keep pace.
To buttress my dazed consciousness of the world, I was visited by an intriguing dream on the 1st of January, and quite accurately, a new year had just unfolded as well. Not 2024, though. I was transported back to the starting line of 2023. All of 2023 was summed up into a mere déjà vu. Of course, there's zero incentive to relive 2023 if given the chance, but I hadn't quite amassed any excitement for 2024 either. When I finally woke up, I quickly cross-verified with Alexa—bypassing my other devices—just to ensure my entire 2023 wasn't some otherworldly joke. I could rely on Alexa's voice, annoying as it may be. As expected, she jolted me back to reality, as unready as I could be.
Playing catch-up...
One of my hardest struggles is with catching up—trends, news, and the constant influx of information, solicited or not. I'm not particularly a fast or eager learner, so processing this overload and distinguishing gems from junk with finesse is no casual task. While I'm attempting to comprehend one batch of concepts, another assortment of opinions is lined up on the other end. How hard is it to belong in this era?
Dumping here the bit of my confusion that I managed to weave into words...
I'm in awe of those individuals who possess a little more knowledge about almost anything and effortlessly stay updated on the 'buzzwords' of our time, such as AI, crypto, growing climate concerns, social justice, gender identity, vaccine hesitancy, political polarisation, mental ill-health, and whatnot. Me? I am drowning! In my struggle to keep up, I agree it's beneficial to actively seek knowledge about such issues and possibly formulate opinions—enough to lean toward a stance amidst the myriad arguments they present. I understand, at least, that it shouldn't always be an 'us versus them' affair. Aiming to be objectively aware, without the pressure of having to pick a side, should equally be another navigable road. Hence, for the controversial issues, I find it disorienting that genuine confusion or any semblance of neutrality is labelled the 'enemy of progress'.
Can't I simply acknowledge an ongoing debate or misunderstanding and leave it at that?
Recently, I watched a show online where opposing parties were asked a range of questions and expected to pick a side. One segment focused on "Flat Earthers" versus "Globe Earthers". Virtually all the speakers on each side made points suggesting that religion was in total opposition to science, which left me confused and disheartened. I’m inclined to ask if both can coexist and still be true, as I'm a Christian who believes in and loves science. Disappointingly, the argument concluded on the note that religion is non-scientific and scientists are atheists. Another segment explored feminists versus non-feminists. Both sides seemed to agree with many points raised by their opponents, yet they compulsively preferred to stick to their guns. I’m not sure which side of the argument I subscribe to the most, but I keep wondering why they couldn't acknowledge that they were narrating the same story using different words, or recognise that they had a middle ground. To kill the glimmer of hope I was still nursing, one of the participants stated in pretty clear terms that anyone who didn't subscribe to her position on the matter was something derogatory.
It's a fascinating era of pinning opinions, pegging ideologies, clipping convictions, and sometimes attempting to simplify (and complicate) issues. We have lovely prefixes and suffixes attached to concepts that probably haven't fully unfolded, causing those yet to make up their minds to tiptoe around collective social issues. So much for this chapter of our lives that prides itself on free speech. Well, yeah, each to their own truth, whether relative or absolute. It’s only a question of who has more compelling evidence or audience, or both.
In making a case for indecision here, I acknowledge that some individuals choose to straddle the fence, allowing them to enjoy the benefits of both sides without sharing in any of the liabilities or moral obligations that holding a strong opinion entails. If it's not life-threatening, neutrality may as well be a blessing. I may not have all my points, but I believe understanding exists on a continuum. Can grace be extended to those who might take it all in a little slower, and perhaps need some more time or facts to come to the realisation of their stance? How about not picking a side at all? Remaining perpetually confused?
“Am I awkward?”
My mind has compulsively mulled over this poser, overheard between two schoolgirls on a cold, uneventful evening. I could detect a hint or two of discontent in the asker’s voice as I strolled away.
Of course, I don't have the full context, but as a retired teenager, I could recite a dozen plausible grounds for that question. I have arrived at a few crossroads myself that have necessitated different iterations of that self-scrutiny. Whatever the case, the question sparked a ‘tink’ in my mind, uncovering a pattern I have observed, one that has recurrently confounded me.
In my teens, awkwardness, or weirdness, or strangeness was an ugly shadow cast upon the unfortunate, and incompatible with the celebrated standard of ‘cool’. The universal albeit nuanced definition of ‘cool’ was pretty clear to us then: be social, be part of the louder school clubs like the debate society, theatre, and entertainment clubs, or engage in sports or politics. Most of the other ventures were considered pale, woefully lacking in flavour or the appeal of the consensus. In every sense, being cool was synonymous with being acceptable, less upsetting, and not weird. Despite my characteristic tininess, I was, fortunately, part of the ‘cool team’ because my extracurricular interests in secondary school aligned with the branding of coolness. Luckily, I hardly had to worry about any of my hidden or obvious eccentricities.
As I moved into my early twenties, a series of events left me struggling to find my footing in medical school. Coincidentally, I was ushered into a 'new world' with a compelling drive to celebrate individual uniqueness. It became widely encouraged not to remain ordinary, and I began to see the world in a different light. Reassuringly, justifiable causes and advocacies were common as the crowd mentality phased out. I particularly loved this period because it destabilised the herd. Individuals who’d lived all their lives in the protection of cliques, or hiding, or nested in whatever chameleon lifestyle that once helped them get by—like me—were told we could find and hone our voices. That it was okay to be a misfit and still shine. That the whole world could revolve around us. ‘Weird is Cool’ became a movement. A pretty big one at that, and with so much potential.
Although ‘potential’ isn’t inherently negative, a great chunk of its value can only be hoped for, as the desired outcome is not guaranteed. Potential might yield the expected or even a serendipitous outcome, or sadly, an undesirable one despite its enormous capacity for good. While this wave had a laudable intention and the potential impact of helping individuals considered misfits thrive and flourish, it threw those considered basic off balance. This ‘stand out’ culture has come to dominate the scene, backed up by current societal ideals. It is deeply rooted in the fabric of our human endeavour—art, science, et cetera.
But if we all stand out, are we truly standing out?
As I inch closer to my third decade, individuality is more precious than ever, with virtually everyone on a quest to outshine, to eke out an X factor, seemingly at all costs. We are expected to be loud with our uniqueness and be defined by it, because it is who we ought to be. Thus, there is an unprecedented proliferation of rather too many cringey exhibitions of awkwardness.
Me?
I’m more confused than ever.
How exceptional is uniqueness these days? Should there be a new definition because ‘unique’ could very well be alien to its name in contemporary usage? If the quality of being unique is absolute, there is some possibility it has ironically lost its savour, becoming what it tried to escape. Subtly, the deviation seems to suggest very little common ground, even though the uniqueness has become our commonality.