On Florida

Matt Isbell, a stellar Florida-based political data analyst, posted this on Twitter for the doubters:

On Sheriffs, Counties and Connecticut

Regarding this October 2023 post from Democracy Docket about the non-necessity of elected sheriffs, I looked up which states have abolished the role of sheriff. It turns out there are only a few, but notable, examples:

  • Only Alaska and Connecticut lack an office of sheriff.
  • Alaska does not have county governments.
  • Connecticut voters moved to abolish the office of sheriff in 2000, replacing the elected office with state marshals and judicial marshals, both of which are non-elected contractors.

But also, who runs the jails if not a sheriff?

Connecticut seems to be far ahead of most states on the question of the relevance of sheriffs, as well as the role of counties, to modern-day government and corrections. This also eliminates the nonsense of “constitutional sheriffs”, and the corruption and feudalism inherent to the office itself.

Imagine such abolition taking place in larger states. How much efficiency would this bring to state government when it comes to zoning, housing, infrastructure, and more?

Georgia Democrats Qualify for a Variety of Seats in a Presidential Year

Qualifying for the May 21 Democratic primary and nonpartisan election ended last Friday at noon.

Statewide:

  • John Barrow is running for Andrew Pinson’s seat on the Supreme Court. This is the first likely-substantial contest against an incumbent justice in years. This “nonpartisan” election is on May 21.
  • There will be a “nonpartisan” contest for an open seat on the State Court of Appeals. Attorney Jeff Davis will face off against Cobb County Magistrate Judge Tabitha Ponder. This “nonpartisan” election is on May 21.
  • The Public Service Commission elections have been cancelled again, and the current commissioners will remain in office for the next two years. It’s likely that we will be voting on all five commissioners in 2026.
  • We are now running for 38 seats (two-thirds) in the Senate and 135 seats (three-fourths) in the House. For comparison, since 1992 we’ve run for at least 75% of the House only in 1992, 1994, 1996 and 2020.
  • We are running for District Attorney positions in 14 circuits. There will be Republican challengers in three circuits: Atlanta, Chattahoochee and Eastern.
  • Democrats are running for all 14 congressional districts. There will be Republican challengers in all but GA13.
  • At the end of qualifying, we left HD104 (a Biden district in Gwinnett County), HD151 (a slightly Trump-voting district in Southwest GA), and SD4 (a Biden district near Savannah) on the table.

And now for local elections around Columbus:

  • We will have a Democrat, Carl Sprayberry, for HD139 (open).
  • We will have a Democrat, Ellen Wright, for SD29. 
  • Debbie Buckner in HD137 will have a primary challenge from Carlton Mahone Jr and a Republican challenger. 
  • Teddy Reese in HD140 will have a Democratic challenger in Alyssa Nia Williams. 
  • There will be a Democratic primary for the open seat in deep-red GA03. Val Almonord, who was the Democratic nominee in 2020 and 2022, will face a challenger.
  • There will be a Republican challenger for GA02. 
  • We now have a Democrat running for District Attorney in Chattahoochee Circuit: criminal defense attorney Anthony L. Johnson. He has no primary opposition, and will be on the ballot in November against Republican and acting DA Don Kelly. We are also challenging a Republican for DA in the Eastern Circuit.
  • Our incumbent Sheriff Greg Countryman is running for re-election as a Democrat. He will be opposed in November by Republican Mark LaJoye.
  • Our incumbent state court solicitor Suzanne Goddard, who previously held office as a Democrat, is running for re-election as a Republican. We have a Democratic challenger in Shevon Sutcliffe Thomas. 
  • Buddy Bryan is running for re-election as Coroner as a Democrat. He will be opposed in the May primary by Royal Anderson. No Republican is running in November. 
  • Lula Lunsford Huff is not running for re-election as Tax Commissioner. David Britt is running as a Democrat for the position and is unopposed in May and November. 
  • We will likely not have a challenger to Gary Allen for Council District 6. A potential candidate fell through. I am sad about this as well since I live here.
  • Toyia Tucker will have a challenge in Council District 4. This “nonpartisan” election is on May 21.
  • There will be a four-way race for Council At-Large 10. This “nonpartisan” election is on May 21.
  • There will be a contest for Board of Education District 7, with Lakeitha Ashe challenging incumbent Pat Frey. This “nonpartisan” election is on May 21.
  • Incumbents unopposed in May and November: Danielle Forte (D) for Superior Court Clerk, Reginald Thompson (D) for Clerk of Municipal Court, Marc D’Antonio (D) for Judge of Probate Court. 
  • No contests for HD138 (Vance Smith (R)), HD141 (Carolyn Hugley (D)), City Council Districts 2, 6 or 8, Board of Education Districts 1, 3, 5, or At-Large 9, or State Court Judge (Temesgen).
  • In addition, there may be some party primary advisory ballot questions. 

Retirements:

  • Both Senate Minority Leader Gloria Butler (SD55) and House Minority Leader James Beverly (HD142) are retiring from their respective chambers.
  • Other Senate Democratic retirements: Valencia Seay (SD34) and Horacena Tate (SD38).
  • Other House Democratic retirements: Doug Stoner (HD42), Roger Bruce (HD61), Mandisha Thomas (HD65), Pedro Marin (HD96), Gregg Kennard (HD107), Gloria Frazier (HD126), Patty Bentley (HD150).

Cornell Does It Again: Sonar+AI for eye-tracking

If you remember:

Now, Cornell has released another paper, this one on GazeTrak, which uses sonar acoustics with AI to track eye movements.

Our system only needs one speaker and four microphones attached to each side of the glasses. These acoustic sensors capture the formations of the eyeballs and the surrounding areas by emitting encoded inaudible sound towards eyeballs and receiving the reflected signals. These reflected signals are further processed to calculate the echo profiles, which are fed to a customized deep learning pipeline to continuously infer the gaze position. In a user study with 20 participants, GazeTrak achieves an accuracy of 3.6° within the same remounting session and 4.9° across different sessions with a refreshing rate of 83.3 Hz and a power signature of 287.9 mW.

Major drawback, however, as summarized by Mixed News:

Because the shape of the eyeball differs from person to person, the AI model used by GazeTrak has to be trained separately for each user. To commercialize the eye-tracking sonar, enough data would have to be collected to create a universal model.

Still, though: Cornell has now come out with research touting sonar+AI as a replacement for camera sensors (visible and infrared) for body, face and now eye tracking. This opens up possibilities for VR and AR hardware that is smaller, more energy-efficient and more privacy-friendly. I’m excited for this work.

Video of GazeTrak (eye-tracking)

Video of PoseSonic (upper-body tracking)

Video of EchoSpeech (face-tracking)
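Conceptually, the echo-profile step the paper describes — emit an encoded inaudible signal, record the reflections, and cross-correlate to localize echoes by distance — can be sketched in a few lines. Everything below (sample rate, sweep band, function names, the simulated reflection) is my own illustrative assumption, not GazeTrak’s actual signal design:

```python
import numpy as np

FS = 50_000          # sample rate in Hz (illustrative)
C = 343.0            # speed of sound in m/s

def chirp(duration=0.005, f0=18_000, f1=21_000):
    """An inaudible linear frequency sweep, standing in for GazeTrak's encoded signal."""
    t = np.arange(int(duration * FS)) / FS
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * duration) * t**2))

def echo_profile(tx, rx):
    """Cross-correlate the received audio with the transmitted chirp.

    Peaks in this profile correspond to reflecting surfaces at different
    round-trip delays (in GazeTrak's case: the eyeball and surrounding skin).
    """
    return np.abs(np.correlate(rx, tx, mode="valid"))

def distance_of_strongest_echo(tx, rx):
    lag = int(np.argmax(echo_profile(tx, rx)))   # round-trip delay in samples
    return lag / FS * C / 2                      # one-way distance in meters

# Simulate a single attenuated reflection from a surface ~2 cm away.
tx = chirp()
delay = int(round(2 * 0.02 / C * FS))            # round-trip delay in samples
rx = np.zeros(len(tx) + delay + 100)
rx[delay:delay + len(tx)] += 0.3 * tx            # the echo
print(distance_of_strongest_echo(tx, rx))        # ≈ 0.02 m
```

The real system feeds these echo profiles to a trained deep learning model rather than reading off peaks directly, which is also why (per the per-user training drawback above) the model has to learn each wearer’s eye shape.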

Idea: Remote state residency

As more people move out of (or are displaced from) California, maybe the state government should consider a type of state residency which can be exercised from other states.

Idea: a remote state residency.

  • Would allow for the following to apply as remote residents of the state without physical, permanent domicile in the state:
    • Non-residents
    • former residents
    • prospective transplants from other states
    • those born in the state of California 
    • Non-residents who apply for any state volunteer program
    • Non-residents who apply to study remotely or in-person at any California public college or university
  • Would allow for participation in certain, but not all, activities and services accessible to active California residents
    • Would issue remote residency cards to successful applicants
    • Easier process for applying to CSU, UC and CCC colleges for remote study
    • Easier, discounted process for applying online to California-based public libraries for digital assets
    • Easier process for applying to California Virtual Academies (or state-operated online K-12 public school)
    • Invitations to voluntary programs
      • California State Guard (CSG)
        • Maritime Component
        • Army Component
        • Air Component
      • Programs of the Chief Service Officer
        • College Corps
        • California Climate Action Corps
        • Youth Jobs Corps
        • AmeriCorps California
        • Disaster Volunteer Management
        • Alumni Network
    • Automatic application to remote residency for non-residents who volunteer for the above
    • Easier remote company formation, banking, payment processing, and taxation
    • Zoom marriages certified and officiated by California county clerks
  • Must be renewed every five years
  • Why:
    • Many people are driven out of California by the housing crisis
    • Many are hoping to leave other current states of residency due to policy
    • No state services are afforded to those who study online in CSU, UC or CCC systems
    • No state services are afforded to those who work remotely for California-based businesses and organizations
    • Remote work and service is an increasing reality, as is the growing interconnectedness of communications
    • The e-residency programs in Estonia and Lithuania offer a forward-looking attempt to extend the concept of citizenship to those who wish to do business in either country
    • This remote residency program would empower many more people to empower California, and would be an investment in our own future as a state

Thoughts on the Vision Pro, VisionOS and AR/VR

These are my collected thoughts about the Vision Pro, visionOS and at least some of the future of mobile AR as a medium, written in no particular order. I’ve been very interested in this device, how it is being handled by the news media, and how it is broadening and heightening our expectations about augmented reality as those who can afford it apply it in “fringe” venues (e.g., driving, riding a subway, skiing, cooking). I also have thoughts about whether we really need that many optical lenses/sensors, how Maps software could be used in mobile smartglasses AR, and what VRChat-like software could look like in AR. This is disjointed because I’m not having the best time in my life right now.

Initial thoughts

These were mostly written around February 1.

  • The option to use your eyes + pinch gesture to select keys on the virtual keyboard is an interesting way to type out words.
    • But I’ve realized that this should lead, hopefully, to a VR equivalent of swipe-typing on iOS and Android: holding your pinch while you swipe your eyes quickly between the keys before letting go, and letting the software determine what you were trying to type. This can give your eyes even more of a workout than they’re already getting, but it may cut down typing time.
    • I also imagine that the mouth tracking in visionOS could make it possible to read your lips for words without having to “listen”, so long as you are looking at a microphone icon. Or maybe that would require tongue tracking, which is a bit more precise.
  • The choice to have menus pop up in the foreground in front of a window is also distinct from Meta’s Quest OS.
  • The World Wide Web in VR can look far better. This opens an opportunity for reimagining what Web content can look like beyond the WIMP paradigm, because the small text of a web page in desktop view may not cut it.
    • At the very least, a “10-foot interface” for browsing the web in VR should be possible and optional.
  • The weight distribution issue will be interesting to watch unfold as the devices go out to consumers. 360 Rumors sees Apple’s deliberate choice to load the weight on the front as a fatal flaw that the company is too proud to resolve. Might be a business opportunity for third-party accessories, however.
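The eye-swipe idea in the bullets above is essentially the same decoding problem as swipe-typing on phones: given the sequence of keys the gaze passed over while the pinch was held, find dictionary words whose letters occur in order along that path. A minimal sketch — the function name and the crude subsequence heuristic are my own; real swipe keyboards add a language model to rank the surviving candidates:

```python
def decode_swipe(path, vocabulary):
    """Return candidate words for a gaze-swipe path.

    A word is a candidate if the path starts and ends on the word's
    first and last letters, and the word's letters appear in order
    somewhere along the path (keys crossed in passing are ignored).
    """
    def matches(word):
        if not path or word[0] != path[0] or word[-1] != path[-1]:
            return False
        it = iter(path)
        return all(ch in it for ch in word)   # subsequence check consumes the iterator
    return [w for w in vocabulary if matches(w)]

# Gaze sweeps h -> e (crossing r, t, y in passing) -> l -> l -> o
print(decode_swipe("hertyello", ["hello", "halo", "ho", "hero"]))
# → ['hello', 'ho', 'hero']
```

Note the ambiguity in the output: several words survive the path filter, which is exactly why the predictive step (“letting the software determine what you were trying to type”) has to do the final disambiguation.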

Potential future gestures and features

Written February 23.

The Vision Pro’s gestures show an increase in options for computing input beyond pointing a joystick:

  • Eye-tracked gaze to hover over a button
  • Single quick pinch to trigger a button
  • Multiple quick pinches while hovering over keys in order to type on a virtual keyboard
  • Dwell
  • Sound actions
  • Voice control

There are even more possible options for visionOS 2.0 both within and likely outside the scope of the Vision Pro’s hardware:

  • My ideas
    • Swiping eye-tracking between keys on a keyboard while holding a pinch in order to quickly type
    • Swiping a finger across the other hand while gazing at a video in order to control playback
    • Scrolling a thumb over a finger in order to scroll up or down a page or through a gallery
    • Optional Animoji, Memoji and filters in visionOS FaceTime for personas
    • Silent voice commands via face and tongue tracking
  • Other ideas (sourced from this concept, these comments, this video)
    • Changing icon layout on home screen
    • Placing app icons in home screen folders
    • iPad apps in Home View
    • Notifications in Home View sidebar
    • Gaze at iPhone, iPad, Apple Watch, Apple TV and HomePod to unlock and receive notifications
    • Dock for recently closed apps
    • Quick access to control panel
    • Look at hands for Spotlight or Control Center
    • Enable dark mode for iPad apps
    • Resize iPad app windows to create desired workspace
    • Break reminders, reminders to put the headset back on
    • Swappable shortcuts for Action button 
    • User Profiles
    • Unlock and interact with HomeKit devices
    • Optional persistent Siri in space
    • Multiple switchable headset environments 
    • Casting to iPhone/iPad/Apple TV via AirPlay
    • (realtime) Translate
    • Face detection
    • Spatial Find My
    • QR code support
    • Apple Pencil support
    • Handwritten notes detection
    • Widget support
    • 3D (360°) Apple Maps
    • 3D support
    • Support for iOS/iPadOS keyboard

VRChat in AR?

Written February 23.

  • What will be to augmented reality what VRChat is to VR headsets and Second Life to desktops?
    • Second Life has never been supported by Linden Lab on VR headsets
    • No news or interest from VRChat about a mixed reality mode
    • (Color) Mixed reality is a very early, very open space
    • The software has yet to catch up
    • The methods of AR user input are being fleshed out
    • The user inputs for smartphones and VR headsets have largely settled
    • Very likely that AR headset user input will involve more reading of human gestures, less use of controllers
  • But what could an answer to VRChat or Second Life look like in visionOS or even Quest 3?
    • Issues
      • VRChat (VR headset) and Second Life (desktop) are about full-immersion social interaction in virtual reality
      • FaceTime-like video chat with face-scanned users in panels is the current extent
      • Hardware weight, cost, size all limit further social avatars
      • Device can’t be used outside of stationary settings as per warranty and company policy
      • Lots of limitations to VRChat-like applications which involve engagement with meatspace
  • What about VRChat-like app in full-AR smartglasses?
    • Meeting fellow wearers IRL who append filters to themselves which are visible to others
    • Geographic AR layers for landmarks
    • 3D AR guided navigation for maps
    • Casting full personal view to other stationary headset/smartglass users
    • Having other users’ avatars visit you at a location and view the location remotely but semi-autonomously

Google Maps Immersive View

Written back on December 23.

Over a year and a half ago, Google announced Immersive View, a feature of Google Maps which would use AI tools like predictive modeling and neural radiance fields (NeRFs) to generate 3D images from Street View and aerial images of both exteriors and interiors of locations, as well as generate information and animations about locations from historical and environmental data for forecasts like weather and traffic. Earlier this year, they announced an expansion of Immersive View to routes (by car, bike or walk).

This, IMO, is one of Google’s more worthwhile deployments of AI: applying it to mashup data from other Google Maps features, as well as the library of content built by Google and third-party users of Google Maps, to create more immersive features.

I just wonder when they will apply Immersive View to Google Earth.

Granted, Google Earth has had 3D models of buildings for a long time, initially with user-generated models in 2009, which were then replaced with autogenerated photogrammetric models starting in 2012. By 2016, 3D models had been generated in Google Earth for locations, including their interiors, in 40 countries, including locations in every U.S. state. So it does seem that Immersive View brings the same types of photogrammetric 3D models of select locations to Google Maps.

The differences between Immersive View and Google Earth seem to be the following:

  • animations of moving cars simulating traffic
  • predictive forecasts of weather, traffic and busyness outward to a month ahead, with accompanying animation, for locations
  • all of the above for plotted routes as well

But I think there is a good use case for the idea of Immersive View in Google Earth. Google touts Immersive View in Maps as “getting the vibe” of a location or route before one takes it. Google Earth, which shares access to Street View with Google Maps, is one of a number of “virtual globe” apps made to give cursory, bird’s-eye views of the globe (and other planetary bodies). But given the use of feature-rich virtual globe apps in VR headsets like Meta Quest 3 (see: Wooorld VR and AnyWR VR, which both have access to Google Earth and Street View’s data), I am pretty sure that there is a niche overlap of users who want to “slow-view” Street View locations and routes for virtual tourism purposes without leaving their house, especially using a VR headset.

But an “Immersive View” for Google Earth and associated third-party apps may have to go in a different direction than Immersive View in Maps.

The AI-driven Immersive View can easily fit into Google Earth as a tool, smoothing over more of the limitations of virtual globes as a virtual tourism medium and adding more interactivity to Street View.

Sonar+AI in AR/VR?

Written around February 17.

Now, if only someone would try hand-tracking, or maybe even eye-tracking, using sonar. The Vision Pro’s 12 cameras (out of 23 total sensors) need at least some replacement with smaller analogues:

  • Two main cameras for video and photo
  • Four downward-facing, two TrueDepth and two sideways world-facing tracking cameras for detecting your environment in stereoscopic 3D
  • Four infrared internal tracking cameras that track every movement your eyes make, as well as an undetermined number of external infrared cameras to see despite lighting conditions
  • LiDAR
  • Ambient light sensor
  • Two infrared illuminators
  • Accelerometer and gyroscope

Out of these, perhaps the stereoscopic cameras are the best candidates for replacement with sonar components.

I can see hand-tracking, body-tracking and playspace boundary tracking being made possible with the Sonar+AI combination.

Links of interest 2/26/24