16 comments

  • MarkusWandel3 days ago
    I always say "on a scale from no canoe to a $5K canoe, even the crappiest canoe is 80% of the way there". This camera illustrates that for vision. When you hear about those visual implants that give you, say, 16x16 grayscale you think that's nothing. Yet 30x30 grayscale as seen in this video, especially with live updates and not just a still frame is... vision. Not 80% of the way there, but does punch way above its weight class in terms of usefulness.
    • SwtCyber6 hours ago
      The moment you add motion and temporal continuity, even a postage-stamp image turns into something your brain can work with.
      • dehrmann13 minutes ago
        In some situations, you can trade resolution for frequency and maintain quality. 1-bit audio is a thing:

        https://en.wikipedia.org/wiki/Direct_Stream_Digital
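The 1-bit-audio idea mentioned above (Direct Stream Digital) can be sketched in a few lines. This is a toy illustration, not DSD itself: a first-order sigma-delta modulator that trades amplitude resolution (1 bit) for sample rate, plus a deliberately crude moving-average decoder.

```python
import numpy as np

def sigma_delta_1bit(x):
    """First-order sigma-delta modulator: encode a signal in [-1, 1]
    as a +/-1 bitstream whose local average tracks the input."""
    bits = np.empty(len(x))
    integrator = 0.0
    feedback = 0.0
    for i, sample in enumerate(x):
        integrator += sample - feedback   # accumulate the quantization error
        feedback = 1.0 if integrator >= 0 else -1.0
        bits[i] = feedback
    return bits

def decode(bits, window=64):
    """Crude decoder: a moving-average low-pass filter."""
    return np.convolve(bits, np.ones(window) / window, mode="same")

# 0.5-amplitude sine, heavily oversampled relative to its period
t = np.arange(16384)
signal = 0.5 * np.sin(2 * np.pi * t / 2048)
recovered = decode(sigma_delta_1bit(signal))
```

Even with this crude decoder, the recovered waveform tracks the original closely, because the noise shaping pushes the quantization error above the signal band.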
      • MarkusWandel5 hours ago
        The brain really is quite a machine. I've personally had a retinal tear lasered. It's well within my peripheral vision, and the lasering of course did more damage (but prevents it from spreading). How much of this can I *see*? Nothing! My peripheral vision appears continuous. Probably I'd only miss a motion event visible to that eye alone, in that particular zone. Not to mention the enormous number of "floaters" one gets, especially by my age (58). Sometimes you see them, but for the most part the brain just filters them out.

        Where this becomes relevant is when you consider depixelation. True blur can't be undone, but pixelation without appropriate antialiasing filtering can:

        https://www.youtube.com/watch?v=acKYYwcxpGk

        So if your 30x30 camera has *sharp* square pixels with no antialiasing filter in front of the sensor, I'll bet the brain would soon learn to "run that depixelation algorithm" and, just from natural motion of the camera, learn to recognize finer detail. Of course that still means training the brain to recognize 900 electrodes, which is beyond the current state of the art (but 16x16 pixels isn't, and the same principle can apply there).
        • jacquesm3 hours ago
          It would be interesting to see how far you could push that. I bet just two scanlines side-by-side would be enough for complete imaging. Maybe even just one, but that would require a lot more pre-processing and much finer control over the angle of movement. Congrats on the positive outcome of that surgery, that must have been pretty scary.
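The "depixelation via motion" idea discussed above can be demonstrated with a toy multi-frame super-resolution sketch (this is an illustration of the principle, not the video's algorithm): sample a scene with sharp, unfiltered box pixels at several known sub-pixel camera shifts, then shift-and-add the frames back onto the fine grid.

```python
import numpy as np

def box_downsample(hi, f, dy, dx):
    """Low-res frame from a high-res scene: sharp f x f box pixels
    (no antialiasing filter), camera shifted by (dy, dx) fine pixels."""
    shifted = np.roll(hi, (-dy, -dx), axis=(0, 1))
    h, w = shifted.shape
    return shifted.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def shift_and_add(frames, shifts, f):
    """Place each frame back on the fine grid at its known offset and average."""
    acc = np.zeros((frames[0].shape[0] * f, frames[0].shape[1] * f))
    for frame, (dy, dx) in zip(frames, shifts):
        up = np.kron(frame, np.ones((f, f)))       # blocky upsample
        acc += np.roll(up, (dy, dx), axis=(0, 1))  # undo the camera shift
    return acc / len(frames)

# High-res "scene" with detail finer than one coarse pixel
n = 64
yy, xx = np.indices((n, n))
scene = np.sin(0.5 * yy) * np.sin(0.4 * xx)

shifts = [(0, 0), (0, 1), (1, 0), (1, 1)]
frames = [box_downsample(scene, 2, dy, dx) for dy, dx in shifts]
multi = shift_and_add(frames, shifts, 2)
single = np.kron(frames[0], np.ones((2, 2)))       # one frame, just upscaled
```

The averaged multi-frame reconstruction recovers noticeably more of the fine structure than any single blocky frame, which is exactly the information the brain (or an algorithm) can exploit when sharp pixels move across a scene.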
    • lillecarl14 hours ago
      Diminishing returns explained through canoes :)

      16x16 sounds really shit to me, who still has perfect vision, but I bet it's life-changing to be able to identify the presence/absence of stuff around you and such! Yay for technology!
      • ACCount3712 hours ago
        This kind of thing is really held back by BCI tech.

        By now, we have smartphones with camera systems that beat human eyes, and SoCs powerful enough to perform whatever image processing you want them to, in real time.

        But our best neural interfaces have throughput close to that of a dial-up modem, and questionable longevity. Other technological blockers advanced in leaps and bounds, but the SOTA in BCI today is not that far from 20 years ago. Because medicine is where innovation goes to die.

        It's why I'm excited for the new generation of BCIs like Neuralink. For now, they're mostly replicating the old capabilities, but with better fundamentals. But once the fundamentals (interface longevity, ease of installation, ease of adaptation) are there? We might actually get more capable, more scalable BCIs.
        • SiempreViernes11 hours ago
          > Because medicine is where "move fast and break things" means people immediately die.

          Fixed the typo for you.
          • ACCount3710 hours ago
            Not moving fast and not breaking things means people die slowly and excruciatingly. Because the solutions for their medical issues were not developed in time.<p>Inaction has a price, you know.
            • omnicognate9 hours ago
              It has a price for the person with the condition. For the person developing the cure it does not (except perhaps opportunity cost, money not made that could have been), whereas killing their patients can have an extremely high one.
            • jama2114 hours ago
              You’re starting to sound terrifyingly like an unethical scientist. We know how that ends, we’ve been down that road before, and we know why it is a terrible idea.
              • rogerrogerr1 hour ago
                There is a lot of space between “persons with a debilitating condition are prohibited from choosing to take a risky treatment that might help” and “hey let’s feed these prisoners smallpox-laced cheese for a week and see what happens”.

                The “no harm, ever” crowd does not have a monopoly on ethics.
        • arcanemachiner12 hours ago
          To anyone wondering:

          BCI == brain-computer interface

          https://en.wikipedia.org/wiki/Brain–computer_interface
          • Lapsa12 hours ago
            Mind-reading technology has already arrived: radiomyography, and neural networks deciphering EEGs.
            • ACCount3711 hours ago
              Not really. Non-invasive interfaces don't have the resolution. Can't make an omelet without cracking open a few skulls.
              • Lapsa8 hours ago
                They do read my mind, at least to some extent: "The paper concludes that it is possible to detect changes in the thickness and the properties of the muscle solely by evaluating the reflection coefficient of an antenna structure." https://ieeexplore.ieee.org/document/6711930
      • metalman12 hours ago
        It is a good illustration of something like Moore's law, pointing to a coming end point where a handheld device will have more than enough capability and capacity to do ANYTHING a mere mortal will EVER require, followed by the death of options and features, and a return to personal autonomy and responsibility.

        AI is the final failure of "intermittent" wipers, which in my latest car are irrevocably enabled to smear the road grime and imperceptible "rain" into a goo, blocking my ability to see.
        • makeitdouble10 hours ago
          True. Then we cross a threshold where things that weren't even thought possible become reachable, and we're back on the treadmill.

          That's what's happening with VR: we reached a point where increasing DPI for a laptop or phone seemed to make no sense; but that was also the point where VR started to be reachable, and there a 300 DPI screen is crude and we'd really want 3x that pixel density.
        • immibis11 hours ago
          use the washer button to spray the windshield with water and help the goo wipe off
          • metalman5 hours ago
            Yes, obviously, but my point is that I am now tasked with helping the "feature" limp along whenever it lurches, unexpectedly, into action, thereby ADDING distraction, which, if you read the ancient myths and legends, is one of the main methods by which evil spirits and demons undermine and defeat the unwary... and lull them into becoming possessed hosts for said entities.

            Who's working for whom here, anyway?

            Already?
  • mdtrooper11 hours ago
    This kind of news is, for me, the real news for this website, rather than another fancy tech product from Apple or a similar corporation.

    Sincerely, many thanks.
    • bbeonx1 hour ago
      Agreed... I think it's fine to keep up with what the corporate world is doing, but these projects bring me real joy.
    • SwtCyber6 hours ago
      Corporate launches are predictable and polished; projects like this are the opposite
  • anotherpaul14 hours ago
    The original post: https://www.reddit.com/r/electronics/comments/1olyu7r/i_made_a_camera_from_an_optical_mouse_30x30/

    The video on Reddit: https://www.reddit.com/r/3Dprinting/comments/1olyzn6/i_made_a_camera_from_an_optical_mouse_30x30/
    • zamadatix9 hours ago
      One of the comments from the creator answered one of my questions (https://www.reddit.com/r/3Dprinting/comments/1olyzn6/comment/nmloqvg/):

      > Do you mean the refresh rate should be higher? There's two things limiting that:
      > - The sensor isn't optimized for actually reading out images, normally it just does internal processing and spits out motion data (which is at high speed). You can only read images at about 90Hz
      > - Writing to the screen is slow because it doesn't support super high clock speeds. Drawing a 3x scale image (90x90 pixels) plus reading from the sensor, I can get about 20Hz, and a 1x scale image (30x30 pixels) I can get 50Hz.

      I figured there would be limitations around the second, but I was hoping the former wasn't such a big limit.
  • consumer4518 hours ago
    Direct link to the reddit post:

    https://old.reddit.com/r/electronics/comments/1olyu7r/i_made_a_camera_from_an_optical_mouse_30x30/
  • gsf_emergency_613 hours ago
    Compressed sensing! What Terence Tao uses to sell math funding!!

    https://www.youtube.com/watch?v=EE9AETSoPHw&t=44

    https://www.instructables.com/Single-Pixel-Camera-Using-an-LED-Matrix/

    (Okay, not the same guy, but I wanted to share this somewhat related "extreme" camera project.)
    • fph11 hours ago
      Is this compressed sensing, though? The description says "Sensor 30x30 pixels, 64 colors (ADNS-3090 if you wanna look it up)", so definitely not a single-pixel camera.
      • gsf_emergency_69 hours ago
        Sorry for the confusion. This is a different setup. TFA uses the 30x30 sensor and no compressed sensing. The link above uses a single photodetector. They also use an LED matrix, but that's to make the _image_ (I think).
    • HPsquared10 hours ago
      I wonder how much quality you could get out of that sensor.
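The single-pixel / compressed-sensing idea discussed in this subthread can be sketched with a toy reconstruction (all names and parameters here are illustrative, not taken from the linked project): each "measurement" is one photodetector reading of a scene through a random mask, and a sparse scene can be recovered from fewer measurements than pixels, here via Orthogonal Matching Pursuit.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily select the k mask patterns
    that best explain the measurements, then least-squares fit on them."""
    residual = y.astype(float).copy()
    support = []
    coef = np.array([])
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))  # best-correlated column
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 64, 32, 4                 # 8x8 scene, 32 measurements, 4 bright pixels
scene = np.zeros(n)
scene[rng.choice(n, k, replace=False)] = rng.uniform(1.0, 2.0, k)
masks = rng.normal(size=(m, n)) / np.sqrt(m)   # one random mask per measurement
readings = masks @ scene            # what the single photodetector records
recovered = omp(masks, readings, k)
```

With the scene 4-sparse and 32 random measurements for 64 unknowns, the greedy recovery typically finds the exact support and reconstructs the scene essentially perfectly.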
  • kachapopopow2 hours ago
    Waiting for someone to build a high-speed camera using mouse sensors.
  • shit_game9 hours ago
    Very cool project. I love the detail the poster went into in their linked video post about working with the sensor and their implementation.

    > Optical computer mice work by detecting movement with a photoelectric cell (or sensor) and a light. The light is emitted downward, striking a desk or mousepad, and then reflecting to the sensor. The sensor has a lens to help direct the reflected light, enabling the mouse to convert precise physical movement into an input for the computer's on-screen cursor. The way the reflected changes in response to movement is translated into cursor movement values.

    I can't tell if this grammatical error is a result of nonchalant editing and a lack of proofreading or a person touching-up LLM content.

    > It's a clever solution for a fundamental computer problem: how to control the cursor. For most computer users, that's fine, and they can happily use their mouse and go about their day. But when Dycus came across a PCB from an old optical mouse, which they had saved because they knew it was possible to read images from an optical mouse sensor, the itch to build a mouse-based camera was too much to ignore.

    Ah, it's an LLM. Dogshit grifter article. Honestly, the HN link should be changed to the reddit post.
    • foxglacier1 hour ago
      LLM or not doesn't matter as much as that it's just bad, reader-hostile writing: a dump of trivial details that also glosses over the relevant part (how a mouse detects movement).
  • jacquesm3 hours ago
    These are 'optical flow' chips. They are quite interesting for many reasons.
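The core trick of an optical-flow chip can be sketched in a few lines (a simplified illustration, not the ADNS-3090's actual on-chip algorithm): compare two successive frames of the surface texture and find the displacement that best aligns them.

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=4):
    """Brute-force block matching, the basic idea behind optical-flow
    mouse sensors: find the (dy, dx) that best aligns curr onto prev."""
    h, w = prev.shape
    m = max_shift
    ref = prev[m:h - m, m:w - m]          # interior window of the old frame
    best_err, best = np.inf, (0, 0)
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            cand = curr[m + dy:h - m + dy, m + dx:w - m + dx]
            err = np.mean((ref - cand) ** 2)
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best

rng = np.random.default_rng(1)
prev = rng.normal(size=(30, 30))             # random surface texture
curr = np.roll(prev, (2, -1), axis=(0, 1))   # scene moved 2 down, 1 left
```

A real sensor does this thousands of times per second in dedicated silicon; the image-readout mode used by the camera project is the slow debugging path, not the fast motion path.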
  • supportengineer5 hours ago
    This is fantastic. What an amazing project! There is a certain segment of photography enthusiasts who love the aesthetic.
  • JKCalhoun8 hours ago
    "I made a camera from an optical mouse. 30x30 pixels in 64 glorious shades of gray!"

    I wonder why so many shades of grey? Fancy!

    (Yeah, the U.K. spelling of "grey" looks more "gray" to these American eyes.)

    Hilarious too that this article is on Petapixel. (Centipixel?)
  • jan_Sate9 hours ago
    Impressive. That's what I read HN for!
  • foxglacier13 hours ago
    I have to say, the Game Boy camera doesn&#x27;t have only 4 colors. It has an analog output you can connect to your own ADC with more bits and get more shades of gray. I even managed to get color pictures out of it by swapping color filters and combining the images.
  • eugene33069 hours ago
    Just in case the author is here: what&#x27;s the FPS?
    • madars6 hours ago
      On Reddit, the author says: "The preview is shown at 20fps for a 3x scale image (90x90 pixels) and 50fps for a 1x scale image. This is due to the time it takes to read the image data from the sensor (~10ms) and the max write speed of the display.", and adds that optical mouse motion tracking goes to 6400 fps for this sensor but you can't actually transmit images at that rate.

      https://old.reddit.com/r/electronics/comments/1olyu7r/i_made_a_camera_from_an_optical_mouse_30x30/nmma45z/
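The quoted figures hang together as a simple frame-time budget. A back-of-envelope check (assuming the ~10 ms sensor read the author mentions is the fixed per-frame cost):

```python
# Frame time = sensor read time + display write time.
read_ms = 10.0
frame_budget_1x = 1000.0 / 50    # 50 fps at 30x30 -> 20 ms per frame
frame_budget_3x = 1000.0 / 20    # 20 fps at 90x90 -> 50 ms per frame

display_ms_1x = frame_budget_1x - read_ms   # time left to push 900 pixels
display_ms_3x = frame_budget_3x - read_ms   # time left to push 8100 pixels
```

So the display write grows from roughly 10 ms at 1x to roughly 40 ms at 3x, which is why the larger preview runs at less than half the frame rate.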
  • SwtCyber6 hours ago
    What I love most is that it takes something we all interact with every day
  • ck23 hours ago
    Vaguely related, but exponentially more impressive: a camera the size of a grain of rice with 320x320 resolution.

    https://ams-osram.com/products/sensor-solutions/cmos-image-sensors/ams-naneyec-integrated-camera-module
    • buildbot3 hours ago
      Woah, “wafer level optics” sounds really fancy.

      https://www.mouser.com/datasheet/3/5912/1/NanEyeC_DS000503_5_00.pdf