Tag Archives: Semiconductor memory
How much more deeply embedded is a visual memory if you crafted the drawing or painting that is the catalyst for its recall?
Fig. 1. Hockney on iPad. Now here’s a lifelog to treasure. How does the master i-paint?
If a moment is to be captured, maybe David Hockney has the answer with an iBrush painting on an iPad. This is closer to the truth of a moment, seen through the artist’s being, their psychological and physiological approach to what they both see and perceive in front of them.
When we learn by doing we form a real memory of the actions required to undertake a task. We build on our initial attempts. The memory of how to ski, to dance, to swim, to skip, to ride a bicycle, to write, to draw, to play a musical instrument – these cannot be caught by a complex collection of digital recording devices. Perhaps if the player wore a total bodysuit, as actors do to play CGI-generated characters, then we’d have a record of the memory of this experience. It wouldn’t a digital memory make – just a record.
The Semantic Web, which aims to standardize the transmission and translation of information, is an important effort in this area. (Bell and Gemmell, 2009, p. 220)
Is it really necessary, possible or desirable to take the moronic qualities of sports coverage and impose them on a person going about their everyday and far less eventful day? This is the premise for a comedy sketch. (Bell and Gemmell, 2009, p. 224)
Fig. 2. How we learn. Conceptualised in SimpleMinds while taking the Open University Masters in Open and Distance Education.
All praise to the blogger Mark Stewart, but does such a record need to be entertaining to gain validity, and so a place in Digital Lives at the British Library? (Bell and Gemmell, 2009, p. 225)
Remembering and talking through a difficult time can also offer a solution. However, there is a difference between hoarding a memory, or keeping it to yourself, and letting it go. The wrong memories permanently on recall would be multiple albatrosses.
Take WW1 veterans, or any war veterans: some want to tell, some want to bury. Why should people in the future feel obliged to record it all, to have more than enough to bring it back?
As written language is such a recent phenomenon it is perhaps not the best or even the most natural way to remember.
Going back, we had the drawn image and the spoken word, but never the absolute of a panoramic or 360° digital picture; rather, a moment filtered through the mind and expressed at the fingertips as a painting or drawing.
Fig. 3. Web 1.0 invigorated by Web 2.0 into a water-cycle of, appropriately, clouds.
Weather systems and water courses, urban run-off and transpiration.
Here the flotsam and jetsam of web content, the good, the bad, the ugly and ridiculous, the massive, the moving, the invaluable as well as the ephemera, is agitated, mixed up and circulated, viewed, pinched, reborn, mashed up, bashed up or left to atrophy – but it is in the mix and open for business. Find it and the thought, or image, or sound bite, the message, the idea is yours to dwell upon and utilise. It is education, revelation and knowledge of a new order. Expect more of the same and the unexpected.
Related articles
- I’ve long visualised digitization as creating an ocean of content. With Web 2.0 this ocean developed currents, weather systems and a water-cycle. (mymindbursts.com)
- Digital content, like its liquid equivalent in a digital ocean, has an extraordinary ability to leak out. (mymindbursts.com)
- The greatest value of extending our capacity to remember, both externally and internally, will be to take a record and build on it, treat it as a living thing that grows into something more. (mymindbursts.com)
- The idea of a machine that acts as a perfect memory prosthesis to humans is not new. (mymindbursts.com)
- The power to remember and the need to forget (mymindbursts.com)
- Automatically captured autobiographical metadata (mymindbursts.com)
- I use dreams to dwell on a topic. (mymindbursts.com)
- The diffusion and use of innovations is complex – like people. (mymindbursts.com)
- Automatically Augmenting Lifelog Events Using Pervasively Generated Content from Millions of People (mymindbursts.com)
- Grimms Fairy Tales – visualization to empower or distort meaning and memory. (mymindbursts.com)
Digital content, like its liquid equivalent in a digital ocean, has an extraordinary ability to leak out.
Fig. 1. Gordon Bell, ready for action – lifelogging for a decade
The biggest problem with lifelogging as it is conceived of by Gordon Bell (2009) is that the camera points away from the protagonist rather than at them.
Far better a record of the person’s facial expressions as they go about their daily business, as an indication of what is going on in their minds – which is otherwise impossible to gauge unless a running commentary is offered. Though of course the running commentary, let alone the wearing of the device and the knowledge that it is on, changes the record. This cannot therefore be an objective documentary record, as in a zoological research study. And then, what do you legally do with images captured not just outside, but inside someone’s house?
This content is implicitly for private and singular consumption only, but it would pick up images that others could use in illicit ways.
Fig. 2. The Point, Beadnell. A memory forever of my encounters with nature on this stick of rock pointing into the North Sea.
I don’t believe Bell’s attitudes regarding privacy are headed for extinction, but some people will choose to keep as much as possible private while others will go to great lengths to expose and disclose everything – in both situations, for better and for worse. (Bell and Gemmell, 2009, p. 213)
If 10,000 asthmatics revealed their health-related lifelogs in real time, how soon would researchers be able to act on this? If alcoholics wore a lifelog, would their drinking stop, and would drink-driving be over forever? What a field day psychologists would have, and what they would learn about all kinds of things such as depression, bipolar disorder or ADHD.
Bell introduces us to the Speechome, where a couple have turned their house into the set of the TV show Big Brother, with cameras everywhere. (Bell and Gemmell, 2009, p. 114)
Their son hasn’t had a choice – there is a ‘total record’ of his development over this period. Is it right to use your own child in this way? And can a record such as this be called a ‘corpus’? It isn’t a scientific study, just a CCTV record. This is where Bell’s language is, throughout, skewed in favour of the system and methodologies he is expounding. He would do far greater justice to his actions if his record were the subject of academic study and peer-reviewed publication, and therefore released to academics. Someone will volunteer this if he won’t.
Part of our era is the sharing and connectivity of information and the way it is transformed through collective experience and comment … even trailblazing a way for many others to do the same.
Fig. 3. Stephen Gough, the bloke who refused to put any clothes on – anywhere, ever. A form of obsession.
There is a character from Scotland who insists on living his life naked.
He is consequently arrested repeatedly. It strikes me, I’m afraid, that Gordon Bell might be evangelical about being naked … but will keep his clothes on. Like an omnivore selling the virtues of veganism while eating everything under the sun. Or will Bell’s 10-15 year lifelog be released to researchers on his death?
‘Most of us are well along the path to outsourcing our brains to some form of e-memory’, Bell says (2009, p. 119).
Should we scrutinise this for some scientific value? ‘Most of us …’ meaning?
From a study of 1,000, or 2,000 people?
Who are they, where do they live, what is their educational background?
What is their access to digital kit and networks? Are they representative of the 6 billion on the planet, or just a community of Silicon Valley computer engineers? ‘Most of us …’ implies that this could be the self-selecting readership of the book. Who would read it unless they could empathise? ‘Well along the path’ implies that there is already a groundswell, a desired adoption of these kinds of technologies.
On what basis is this to be believed?
Are there a number of ‘diffusion of innovation’ studies currently underway to measure this? What is the benchmark? What are the parameters of the path?
‘Our brains’ – by what definition are they either ‘ours’ or even ‘brains’?
A living organ cannot be outsourced, can it? This isn’t like making a donation to a sperm bank. There is no means to store any component of our brains, nor has anything more than a gallery of images or a storage space for documents yet been developed. There is no electronic memory. Even if you want to call a relational database on a hard drive an e-memory, it cannot be one – no amount of juggling the electronic pack of cards will turn an audio file, a still image or a video into a memory. Indeed, the only possible association with a memory is when someone looks at them and a memory forms in their mind – and what is more, anyone at all looking at, hearing or viewing these records will also form memories. They are the enablers of memory recall, or thought creation; they are a catalyst, but they can never be the memory.
Related articles
- Automatically Augmenting Lifelog Events Using Pervasively Generated Content from Millions of People (mymindbursts.com)
- My Self-Portrait. Now I get Van Gogh. (grandmaeileensvintage.wordpress.com)
- Can Lifelogging Devices Augment Our Memories? (techonomy.com)
- Lifelogging and Self Quantization : the good and the bad (gelnior.wordpress.com)
- “Skate where the puck’s going, not where it’s been.” (mymindbursts.com)
- The reality is that our digital world long ago washed over the concept of an e-memory. (mymindbursts.com)
- The memory is the mind process happening in your brain, it can never be the artefact that plays back footage of an experience. (mymindbursts.com)
The idea of a ‘world brain’ that acts as a perfect memory prosthesis to humans is not new.
Fig. 1. H. G. Wells
In the late 1930s, British science fiction writer H. G. Wells wrote about a “world brain” through which “the whole human memory can be [ . . . ] made accessible to every individual.” Mayer-Schönberger (2011. p. 51)
I think to keep a lifelog is to invite sharing. It’s so Web 2.0.
It may be extreme, but some will do it, just as people keep a blog, or post a picture taken every day for a year or more. The value and fears of such ‘exposure’ on the web have been discussed since the outset. There are new ways of doing things, new degrees of intimacy.
‘Obliterating the traditional distinction between information seekers and information providers, between readers and authors, has been a much-discussed quality since the early days of the Internet’. Mayer-Schönberger (2011. p. 83)
‘By using digital memory, our thoughts, emotions, and experiences may not be lost once we pass away but remain to be used by posterity. Through them we live on, and escape being forgotten’. Mayer-Schönberger (2011. p. 91)
At a faculty level I have twice created blogs for the recently deceased.
Fig. 2. Jack Wilson MM, 1938
It was with greater sadness that I did so for my own parents: my father in 2001 and my mother in 2012. Meanwhile, by recording interviews with my late grandfather I moved closer to the conception of a digital expression of a person. It doesn’t take much to imagine a life substantially ‘lifelogged’ and made available in various forms – a great tutor who continues to teach, a much loved grandparent or partner to whom you may still turn …
Fig. 3. Bell and Gemmell imagine lifelogs of thousands of patients being used in epidemiological surveys. (Bell and Gemmell, 2009, p. 111)
This has legs. It ties in with a need. It relates to technologies being used to manage patients with chronic illnesses. It ties in to the training of clinicians too.
Related articles
- The power to remember and the need to forget (mymindbursts.com)
- Automatically assisting human memory: (mymindbursts.com)
- Memoto Lifelogging Camera (gadgetizeme.wordpress.com)
- The memory is the mind process happening in your brain, it can never be the artefact that plays back footage of an experience. (mymindbursts.com)
- Bianca Bosker: In Defense Of Forgetting To Forget (huffingtonpost.com)
- The idea of gathering a substantial part of one’s life experience fascinates me, as it has often inspired others (mymindbursts.com)
Dreams of technology-enhanced learning as a micro-chipped jellyfish in a digital ocean
Fig. 1. I visualised the biological and digitised memory as a huge, translucent jellyfish.
It is a deliberate exercise to fall asleep with a book or eBook in my hands and in my head. This may even be a mid-evening exercise lasting between 20 and 45 minutes. It works if I remember what goes on and then write it down. There’s no way this can be digitally grabbed.
In the skin of the jellyfish there is a microchip. This microchip represents the digital record, the stamp-like artefact that is a snapshot in time, a set of images, a sound-bite, a record of the creature’s physiological outputs. The rest of the creature is what isn’t captured – the tendrils of the jellyfish, the synapses that connect the bulk of the creature, the memories that defy definition.
It isn’t a digital memory. The visual or sensory capture is a fraction of what forms the memory.
Nor is memory static. It is forming and reforming, diminishing, refreshing and fracturing all the time. To grab a ‘memory’ is to capture and box a set of impossibly complex electro-chemical reactions. It is multi-dimensional too – when I see the Royal Cinema, I feel a sticky ice-lolly stick on my neck, I smell the fusty, cigarette-smoke embedded chairs and hear the announcement of the serial. And what I see is filtered through my mind’s eye, not a lens.
I don’t see it in high definition.
There are three kinds of memory: (Bell and Gemmell, 2009. p. 53)
- Procedural (muscle memory)
- Semantic (facts that you know that aren’t rooted in time and place)
- Episodic (autobiographical)
Perhaps if I am going to wear a gadget around my neck it should record something I cannot see or sense?
At a different wavelength or spectrum, i.e. telling me something I don’t know.
‘Biological memory is subjective, patchy, emotion-tinged, ego-filtered, impressionistic, and mutable. Digital memory is objective, dispassionate, prosaic, and unforgivingly accurate’. (Bell and Gemmell, 2009, p. 56)
The first is a memory; the second is not. Bell appears to think that ‘memory’ is an artefact that is capable of a digital record – it is not. Nothing that MyLifeBits has done is a record of a memory – it is simply stuff digitised. For something to be called a ‘digital memory’ it would need to have the attributes of its biological and analogue form.
A memory is a product of our lapses and distractions.
It matters that we daydream, as well as focus. It matters that what we see or experience once needs to be experienced a second, third and fourth time so that meaning aggregates and our minds adapt. What Bell is describing is a massive, relentless, comprehensive attempt at keeping a diary. Would it not be more useful to hire a personal assistant? If you have the wealth to support it, have, like Winston Churchill, a secretary at hand to take dictation. That has to be the closest proximity to the record of a memory as it is formed.
With the written word came libraries.
‘If you have ever tried reading an old diary entry of yours from many years ago, you may have felt this strange mixture of familiarity and foreignness, of sensing that you remember some, perhaps most, but never all of the text’s original meaning’. (Mayer-Schönberger, 2009, p. 33)
On Gordon Bell – his goal is nothing short of obliterating forgetting. (Mayer-Schönberger, 2009. p. 50)
I wonder if the quest to make an artificial intelligence like the human mind will be a more fruitful one than trying to turn a human mind into a digital one? That adding AI attributes to a database will achieve more than thinking of the human as nothing more than a bipedal device from which to record external goings-on?
Related articles
- The memory is the mind process happening in your brain, it can never be the artefact that plays back footage of an experience. (mymindbursts.com)
- The idea of gathering a substantial part of one’s life experience fascinates me, as it has often inspired others (mymindbursts.com)
- The power to remember and the need to forget (mymindbursts.com)
- Automatically Augmenting Lifelog Events Using Pervasively Generated Content from Millions of People (mymindbursts.com)
- Automatically captured autobiographical metadata (mymindbursts.com)
- Going, Going, Gone: the Where and Why of Memory Erasure (theepochtimes.com)
- Saving the digital decade: DPC rewards organizations helping to safeguard our digital memory (girlinthearchive.wordpress.com)
- Digital content, like its liquid equivalent in a digital ocean, has an extraordinary ability to leak out. (mymindbursts.com)
- The idea of a machine that acts as a perfect memory prosthesis to humans is not new. (mymindbursts.com)
- I use dreams to dwell on a topic. (mymindbursts.com)