A macOS terminal command that tells you if your USB-C cable is bad (kau.sh)
_carbyau_ 10 minutes ago [-]
Is there a reason we can't plug a USB-C cable with BOTH ends into the same computer and then get a full diagnostic on just the cable itself?
Izkata 21 hours ago [-]
It's more complicated than "this cable is good/bad". I had a suspicion about one of my cables for months, but just last week I confirmed it with a device that shows the volts/amps/watts/total kWh passing through it: I have a USB-C cable with orientation. Plugged in one way it transfers about 5x more power than the other way, and in the lower orientation it's low enough for the earbud case I use it with to not charge.
brailsafe 58 minutes ago [-]
My Pixel 7 seems to have fully died out of the blue while charging two days ago, using a USB-C cable I thought might be getting a little flaky (connected to my Mac, I'd occasionally get repeating disconnects). I wonder if something along these lines could be the culprit.

I picked it up to find it had shut itself off, and now won't accept any charge, wireless or wired from any combination of power sources and cables. No signs of life at all.

wiradikusuma 18 hours ago [-]
Could you elaborate on "orientation"?

Let's say for C-to-C, are you talking about swapping the head/tail? Or simply connecting at a different angle (180 degrees)?

rcxdude 18 hours ago [-]
Probably 180 degrees rotation in the plug (on either end). It commonly happens if one of the contacts or conductors for USB-PD signalling is not working correctly: because of the way the pinout is designed to work either way around, the conductors used for signalling swap roles depending on the orientation.
Izkata 13 hours ago [-]
Yep, 180 degree rotation.
giancarlostoro 12 hours ago [-]
That's so weird, did you wind up coloring one end or something? I still wish we would add color to USB-C cables like USB 3 has, to emphasize features and expected uses. USB-C was a much-needed change from USB 3 and 2 in terms of being reversible and superior, but every manufacturer implements the cables differently, and it's confusing and hard to figure out which cable is best for what.
opan 14 minutes ago [-]
Some cables have "10Gbps" and similar printed near the end.
lostlogin 17 hours ago [-]
The audio community loves this sort of thing and will pay top dollar for unidirectional cables. Reproducible data proving the claims could be worth millions.
ChrisGreenHeur 17 hours ago [-]
Well, if you listen to audio you would not want it to accidentally get confused and head back to where it came from halfway down the cable, right?
kstrauser 11 hours ago [-]
“This cut signal reflections, yielding brighter high hats without the brassiness of two-directional cabling. Bass was particularly clear and rumbly without the muddiness we heard from Monoprice cords.”
duttish 20 hours ago [-]
Wait, what? I thought half the point of USB-C was to not rely on orientation.

Is there any way to check this other than experiment?

My "solution" so far has been to not buy cheap cables and just hope I get quality in return.

lmm 19 hours ago [-]
> I thought half the point of USB-C was to not rely on orientation.
> Is there any way to check this other than experiment?

Well sure, a standards-compliant cable will work in either orientation, but it's always possible for some but not all of the pins or wires to break.

pja 18 hours ago [-]
I believe USB C cables actually do have an orientation - it's just that the negotiation both ends do usually makes that orientation irrelevant.

Maybe the negotiation can fail & the plugged in orientation is then the only one that works?

estimator7292 12 hours ago [-]
USB-C only has an "intrinsic" orientation because we call one set of pins "1" and the other "2". Electrically there should be no difference.
lxgr 9 hours ago [-]
No, there really is an intrinsic orientation, at least once a cable is plugged in.

The receptacles are symmetric, but a full connection is not. The cable only connects CC through end-to-end on one of A5 or B5, but not both, which lets the DFP detect which of A5 or B5 should be CC. The one not used for CC is then used to power the e-marker in the cable, if any.

This is also true for legacy adapters; for example, for C-to-A USB 3 adapters, the host needs to know which of the two high-speed pairs to use (as USB 3 over A/B connectors only supports one lane, while C supports two).

tennysont 20 hours ago [-]
I think that I have a specific cable-device-orientation that is broken. Meaning, I think a particular USB C cable won't charge my phone if it's plugged in 'backwards'.

I always assumed that USB C cables use different pins depending on orientation, and that some pins on the cable wore down.

Maybe that's what happened here?

consp 19 hours ago [-]
My guess would be they used a one-sided PCB to connect the cable and used half the wires. Some sockets internally link the power and ground pins, so it works both ways, but you get no resistor network and thus only standard 5V, which gives you 500mA max (at best). With the resistors connected by the cable it's about 900mA to 3A, which is probably what happens when plugged in "correctly". Or some other magic happens on one side of the PCB to fool the charger into pushing the full 3A.
lxgr 18 hours ago [-]
Shouldn't a compliant USB-C DFP not supply Vbus without the resistor network, though, so there should be no charging at all? (Not that all DFPs necessarily do the correct thing, of course.)
SAI_Peregrinus 12 hours ago [-]
Correct, which is probably why it won't even charge their earbuds in the broken orientation.
Waterluvian 20 hours ago [-]
I think a more distressing thought is that it’s quite possible that your cable won’t charge your phone if it’s plugged in forwards.
numpad0 18 hours ago [-]
It's CC2/VCONN being used for the eMarker. That pin may be terminated inside the cable and used to power the eMarker chip. It can also be used for orientation sensing. I think.
cyberax 19 hours ago [-]
It happens. More often than not, it's physical damage or a manufacturing defect in one of the contacts and/or wires.
atoav 20 hours ago [-]
It is not unheard of to have single damaged lines/connector-joints within a cable. The question is whether your cable was designed that way or whether it was damaged in a way that made it do this.
Perz1val 17 hours ago [-]
It won't be a damaged wire (there's only one set of those); it's the plug lacking connectors or having them not connected.
dist-epoch 17 hours ago [-]
I can confirm, I have a USB-C cable with the same problem, charging speed depends on the orientation of the USB-C connector, which is hilarious.

It was not a cheap cable, it was a medium-priced one with good reviews from a known brand.

mnw21cam 17 hours ago [-]
No, I don't get it. Firstly, the normal system command output is not hard to read, but secondly, this output doesn't list any of the capabilities of the cables, just the devices at the ends of them. Perhaps showing an example of the output when the device is plugged in through the wrong cable would have helped. Does the tool produce a similar warning to the system popup, that is "this cable and device are mismatched"?
lxgr 9 hours ago [-]
As far as I understand, the idea is to determine whether the cable is the bottleneck by comparing a hardcoded list of theoretical device capabilities with the actually observed connection speeds as reported by the OS.

It would be nice to just compare with the device's reported maximum capability, but I'm not sure whether macOS exposes that in any API.
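For illustration, a minimal sketch of that comparison idea in Python (not the author's actual tool): shell out to system_profiler SPUSBDataType -json and surface whatever speed-like fields each device reports. The _items/_name layout matches system_profiler's JSON output; the exact speed field names vary by macOS release, so nothing here is assumed beyond the data type:

    import json
    import subprocess

    # Sketch only: dump the USB tree as JSON and print any speed-related
    # fields per device, so the reported link speed can be eyeballed
    # against what the device should be capable of.
    raw = subprocess.run(
        ["system_profiler", "SPUSBDataType", "-json"],
        capture_output=True, text=True, check=True,
    ).stdout
    tree = json.loads(raw)

    def walk(items, depth=0):
        for item in items:
            speeds = {k: v for k, v in item.items() if "speed" in k.lower()}
            print("  " * depth + str(item.get("_name", "?")), speeds or "")
            walk(item.get("_items", []), depth + 1)

    walk(tree.get("SPUSBDataType", []))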

bapak 17 hours ago [-]
Fun fact: this information is already available in the System Information app on your Mac.

Hardware -> USB

I also use the app to check what wattage my cables are when charging my MacBook (Hardware -> Power)

lxgr 9 hours ago [-]
This only shows you the minimum of what the cable and adapter support together, though. I believe this is a fundamental limitation of the protocol; the source won't tell you about voltage/current combinations not supportable by the cable.
angulardragon03 14 hours ago [-]
Yes, system_profiler is just a terminal version of System Information.
BrandoElFollito 17 hours ago [-]
I was looking for a USB cable tester, where I would plug in both ends of my cable and it would test it (power, data, ...).

There are plenty for Ethernet, but no such thing for USB. Was I looking with the wrong keywords, or does such a device not exist?

Note: I have a dongle that measures the power when inserted between the laptop and the charger, this is not what I am looking for

bariumbitmap 17 hours ago [-]
I have the Treedix cable tester, it works well.

https://treedix.com/collections/best-seller/products/treedix...

lxgr 17 hours ago [-]
The reason is probably that anything faster than USB 2.0 (480 Mbit/s) or supporting power over 3A/60W will need to have an active marker, and to read that, you'll need something slightly more complex than a connection tester.

That said, these things do seem to exist at this point, as sibling comments have pointed out.

As an aside, it's a real shame devices with USB-C ports don't offer this out of the box. They need to be able to read the marker anyway for regular operation!

_rs 13 minutes ago [-]
Right? It would be great if I could plug a USB-C cable into 2 ports on my Mac and it could figure out what the cable is capable of
moray 17 hours ago [-]
On AliExpress, search for "DT3 Data Cable Detection Board Type-C"; they're very cheap. I got the one below and it seems to work fine for what I needed.

https://fr.aliexpress.com/item/1005007509475055.html

Edit: This will test whether the cable is functioning properly. It will show the connections and indicate whether the cable supports only power or also data transfer. However, it won’t provide information about the USB-C cable type or its speed capabilities.

9029 16 hours ago [-]
I have been planning to get either Witrn K2 or Power-Z KM003C. If just cable testing is enough, the Treedix one is probably good.

Related: If you are looking for cables, this guy has tested a bunch (mainly for charging capabilities) https://www.allthingsoneplace.com/usb-cables-1

amelius 15 hours ago [-]
Can these instruments measure bit error rates?
amelius 15 hours ago [-]
I'd expected to see at least characteristic impedance in that table.

And some metrics on internal reflections.

mrheosuper 17 hours ago [-]
What do you mean by "testing" it: reading hardcoded data from the e-marker chip, or really testing it?

The latter would require a multi-thousand-dollar machine.

tom_alexander 16 hours ago [-]
I'm curious as to why it is so expensive? Admittedly I know very little about electronics, and naturally the validation testing that a cable manufacturer does is going to be more thorough, but for consumer-grade testing couldn't we just have an FPGA or microcontroller scream the Fibonacci sequence in one end and have another listen for it on the other end? Sort of like memtest, but ramping up speed until the transmission becomes garbled.
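A consumer-grade version of that idea is indeed simple at low speed. A sketch, assuming a microcontroller at the far end of the cable under test that echoes every byte back over USB serial; the device path is hypothetical and pyserial is required:

    import os
    import serial  # pyserial: pip install pyserial

    # Loopback bit-error sketch at USB CDC serial speeds only -- nowhere
    # near the rates needed to actually validate a 20/40 Gbps cable.
    port = serial.Serial("/dev/ttyACM0", timeout=2)
    pattern = os.urandom(4096)          # random data beats Fibonacci here
    port.write(pattern)
    echoed = port.read(len(pattern))
    bit_errors = sum(bin(a ^ b).count("1") for a, b in zip(pattern, echoed))
    print(f"got {len(echoed)}/{len(pattern)} bytes, {bit_errors} bit errors")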
klausa 15 hours ago [-]
120Gbps is _really_ fast.
Someone 3 days ago [-]
> The script parses macOS’s system_profiler SPUSBHostDataType command, which produces a dense, hard-to-scan raw output

I couldn’t find the source (the link in the article points to a GitHub repo of a user’s home directory; I hope for them it doesn’t contain secrets), but on my system, system_profiler -json produces JSON output. From that text, it doesn’t seem they used that.

sorcercode 3 days ago [-]
It internally uses the same root command btw. In fact this recently changed for Tahoe (as the article mentions).

It started out as a shell script but switched to a Go binary (which is what is linked).

procaryote 19 hours ago [-]
I hope this doesn't become a trend. Moving it to Go means you need to compile it before you run it, or blindly run an uninspected binary from some random guy.

It's not like the performance of this could have motivated it

lxgr 17 hours ago [-]
I'll take the minimal hassle of having to compile a Go program over a complex shell script that only the author understands (if that) any day.

Performance isn't everything; readability and maintainability matter too.

timeon 16 hours ago [-]
> Performance isn't everything; readability and maintainability matter too.

Is that case for this vibe-coded thing? https://news.ycombinator.com/item?id=45513562

lxgr 10 hours ago [-]
No idea, I haven't had a look at this code in particular.

I'm just saying that I've seen several "small tools that could have been shell scripts" in Go or another more structured language and never wished they were shell scripts instead.

tensor 1 hours ago [-]
I mean, you shouldn't blindly run a shell script any more than a binary anyway. And if you're reading the code, I'd rather read Go than bash any day. That said, yes, there is an extra compilation step.
JdeBP 20 hours ago [-]
Correct. But you didn't see that the source was one level up in the directory tree from the untrustworthy binary blob?

* https://github.com/kaushikgopal/dotfiles/blob/master/bin/usb...

Presumably there is a sensible way to do this in Go by calling an API and getting the original machine-readable data, rather than shelling out to run an entire sub-process for a command-line command and parsing its human-readable (even JSON) output. Especially as it turns out that the command-line command itself runs another command-line command in its turn. StackExchange hints at looking to see what API the reporter tool under /System/Library/SystemProfiler actually queries.

Someone 19 hours ago [-]
> But you didn't see that the source was one level up in the directory tree from the untrustworthy binary blob?

No, silly me. I briefly searched for a src directory, but of course, I should have searched for a bin directory, as that’s where vibe coding stores sources /s.

NelsonMinar 18 hours ago [-]
lsusb will get you this info in Linux, but I like the idea of a little wrapper tool to make the output easier to parse.

480 vs. 5000 Mbps is a pernicious problem. It's very easy to plug in a USB drive that looks like it works fine and is reasonably fast. Right until you try to copy a large file to it and wonder why it is only copying 50 MBytes/second.

It doesn't help that the world is awash in crappy charging A-to-C cables. I finally just threw them all away.

lxgr 18 hours ago [-]
I remember hearing it’s even possible to plug in a USB-A plug too slowly, making the legacy pins make contact first, which results in a 480 Mbps connection – despite the cable, the host, and the device all supporting superspeed!
lloeki 18 hours ago [-]
Can confirm, was victim of this.

Couldn't figure out why my 5-disk USB enclosure was so ungodly slow. Quickly I saw that it was capping suspiciously close to some ~40MB/s constant, so 480Mbps.

lsusb -v confirmed. As it happened I did some maintenance and had to unplug the whole bay.

Since the port was nearly tucked against a wall I had to find the port by touch and insert somewhat slowly in steps (brush finger/cable tip to find port, insert tip at an angle, set straight, push in) but once in place it was easy to unplug and insert fast...

This was driving me "vanilla ice cream breaks car" nuts...

andrewmcwatters 10 hours ago [-]
Destroy the whole standard. That's literally insane.
lxgr 8 hours ago [-]
That's the price of strong backwards compatibility. Otherwise, you wouldn't be able to use a USB 3 (superspeed) device on a USB 3 host port with a USB 2 cable at all.

And if you hate this, you should probably never look into these (illegal by the spec, but practically apparently often functional) splitters that separate the USB 2 and 3 path of a USB 3 capable A port so that you can run two devices on them without a hub ;)

andrewmcwatters 39 minutes ago [-]
What in the world…
Tepix 18 hours ago [-]
Why does it mention USB 3.2 (i.e. 20 Gbps) at all if it's for Macs? I thought Macs only support 10 Gbps and 40 Gbps, but nothing in between?

(which is inconvenient because USB 3.2 Gen 2x2 20 Gbps external SSD cases are much cheaper than USB 4 cases for now).

Also, he is calling a binary a script, which I find suspicious. This task looks like it should have been a script.

gruez 44 minutes ago [-]
>Why does it mention USB 3.2 (i.e. 20 Gbps) at all if [...]

USB-IF in all their wisdom used "USB 3.2" to refer to everything from 5 Gbps (USB 3.2 Gen 1×1) to 20 Gbps.

https://en.wikipedia.org/wiki/USB_3.0#USB_3.2

gattr 18 hours ago [-]
On a somewhat related note, I like the IO shield of my new MSI motherboard - the USB ports are tersely labeled "5G", "10G", "40G" (and a few lingering "USB 2.0").
cozzyd 12 hours ago [-]
One of my pet peeves is when people call a binary a script
madethemcry 18 hours ago [-]
Content-wise a nice idea, but I also like the conclusion about how AI made this possible in the first place. The author himself mentions this motivation. AI is undoubtedly perfect for utilities and small (even company-internal) tools for personal use, where maintainability is secondary because you can ditch the tool or rebuild it quickly.

> Two years ago, I wouldn’t have bothered with the rewrite, let alone creating the script in the first place. The friction was too high. Now, small utility scripts like this are almost free to build.

> That’s the real story. Not the script, but how AI changes the calculus of what’s worth our time.

sersi 17 hours ago [-]
I've found that to be very true. For bigger projects, I've had rather mixed results from AI, but for small utility scripts, it's perfect.

But like the author, I've found that it's usually better to have the LLM output Python, Go, or Rust than use bash. So I often had to ask it to rewrite at the beginning; now I just skip bash from the start.

atonse 11 hours ago [-]
Came here to say exactly this.

All the naysayers are missing the tons of small wins that are happening every single day from people using AI to write code, wins that weren't possible before.

I mentioned in a thread a few weeks ago that we maintain a small Elixir-Rust library, and I have never coded Rust in my life. Sure, it's about 20 lines of Rust, mostly mapping to the underlying Rust lib, but so far I've used Claude to maintain it (fix deprecations, perform upgrades, etc).

This simply wasn't possible before.

simianparrot 17 hours ago [-]
This is a vibe coding Trojan horse article.

> That’s the real story. Not the script, but how AI changes the calculus of what’s worth our time.

Looking at the GitHub source code, I can instantly tell. It's also full of gotchas.

Cthulhu_ 17 hours ago [-]
Ugh. I appreciate the tool and I suppose I can appreciate AI for making the barrier to entry for writing such a tool lower. I just don't like AI, and I will continue with my current software development practices and standards for production-grade code - code coverage, linting, manual code reviews, things like that.

At the same time though I'm at a point in my career where I'm cynical and thinking it really doesn't matter because whatever I build today will be gone in 5-10 years anyway (front-end mainly).

ChrisGreenHeur 17 hours ago [-]
Is it worth it for everything? If you need a bash script that takes some input and produces some output, does it matter if it's from an AI? It has to get through code review, and the person who made it has to read through it before code review so they don't look like an ass.
alex_duf 17 hours ago [-]
Yeah, recently I needed a script to ingest individual JSON files into an SQLite DB. I could have spent half the day writing it, or ask an AI to write it and spend 10 minutes checking that the data in the DB is correct.

There are plenty of non-critical aspects that can be drastically accelerated, but also plenty of places where I know I don't want to use today's models to do the work.
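For scale, a minimal sketch of the flat-JSON case (one object per file; the directory, table, and DB names are hypothetical). The nested-entity case discussed below is where the real time goes:

    import json
    import sqlite3
    from pathlib import Path

    # Sketch: ingest a directory of flat, one-object-per-file JSON
    # documents into a single SQLite table. Columns come from the first
    # file; nested models would need real parsing and extra tables.
    records = [json.loads(f.read_text())
               for f in sorted(Path("json_in").glob("*.json"))]
    cols = list(records[0])
    conn = sqlite3.connect("data.db")
    conn.execute(f"CREATE TABLE IF NOT EXISTS items ({', '.join(cols)})")
    conn.executemany(
        f"INSERT INTO items VALUES ({', '.join('?' for _ in cols)})",
        [tuple(r.get(c) for c in cols) for r in records],
    )
    conn.commit()
    conn.close()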

cozzyd 12 hours ago [-]
I worked with a contractor for a contractor who had AI write a script to update a repository (essentially doing a git pull), but for some strange reason it was using the GitHub API instead of git. The best part is that if the token wasn't set up properly, it overwrote every file (including itself) with 404s.

Ingesting JSON files into SQLite should only take half a day if you're doing it in C or Fortran for some reason (maybe there is a good reason). In a high-level language it shouldn't take much more than 10 minutes in most cases, I would think?

alex_duf 11 hours ago [-]
Regarding how long the ingestion should take to implement, I'm going to say: it depends!

It depends on how complex the models are, because now you need to parse your models before inserting them. Which means you need the tables to be in the right format. And then you need your loops: for each file you might have to insert anywhere between 5 and 20 nested entities. And then you either have to use an ORM or write each SQL query.

All of which I could obviously do; it isn't rocket science, just time-consuming.

cozzyd 11 hours ago [-]
Sure, if the JSON is very complicated it makes sense that it could take a lot longer (but then I wouldn't really trust the AI to do it either...)
raincole 17 hours ago [-]
The author literally says this is vibe-coded. You even quoted it. How the hell is this "Trojan horse"? Did the Greeks have a warning sign saying "soldiers inside" on their wooden horse?
amelius 14 hours ago [-]
No, but it lacked the product safety leaflet.
simianparrot 16 hours ago [-]
Because it’s not in the title, and I personally prefer up-front warnings when generative “AI” is used in any context, whether it’s image slop or code slop
wcrossbow 17 hours ago [-]
I'm not a Go developer and this kind of thing is far from my area of expertise. Do you mind giving some examples?

As far as I can tell skimming the code, and as I said, without knowledge of Go or the domain, the "shape" of the code isn't bad. If I got any vibes (:)) from it, it was the lack of error handling and over-reliance on exact string matching. Generally speaking, it looks quite fragile.

FWIW I don't think the conclusion is wrong. With limited knowledge he managed to build a useful program for himself to solve a problem he had. Without AI tools that wouldn't have happened.

alias_neo 17 hours ago [-]
There's a lot about it that isn't great. It treats Go like a scripting language; it's got no structure (1000+ lines in a single file); nothing is documented; the models are flat, with no methods; it hardcodes lots of strings; even the flags are string comparisons instead of using the proper tool; regex compiles and uses are inlined; device support is limited to some pre-configured, hard-coded strings; and some assumptions are made about storage device speed based on the device name: nvme=fast, hdd=slow, etc.

On the whole, it might work for now, but it'll need recompiling for new devices, and is a mess to maintain if any of the structure of the data changes.

If a junior in my team asked me to review this, they'd be starting again; if anyone above junior PR'd it, they'd be fired.

nottorp 16 hours ago [-]
> Generally speaking, it looks quite fragile

I have a USB-to-SATA adapter plugged in and it's labeled as [Problem].

nubinetwork 20 hours ago [-]
They make a hardware device for this; it has several USB plugs on it and two rows of LEDs that light up if the wires are all connected.
altairprime 28 minutes ago [-]
See above thread for more details and links:

https://news.ycombinator.com/item?id=45513256

basepurpose 18 hours ago [-]
Yes. Not that expensive.
eikenberry 2 hours ago [-]
> I was punching through my email actively as Claude was chugging on the side.

I wonder how much writing these scripts cost. Were they done in Claude's free tier, pro, or higher? How much of their allotted usage did it require?

I wish more people would include the resources needed for these tasks. It would really help evaluate where the industry is in terms of accessibility: how much it is reserved for those with sufficient money, and how that scales.

bediger4000 2 days ago [-]
> Two years ago, I wouldn’t have bothered with the rewrite, let alone creating the script in the first place. The friction was too high. Now, small utility scripts like this are almost free to build.

This aligns with the hypothesis that we should see lots and lots of "personalized" or single-purpose software if vibe coding works. This particular project is one example. Are there a ton more out there?

emilburzo 18 hours ago [-]
+1 here. With the latest Chrome Manifest V3 shenanigans, the Pushbullet extension stopped working and the devs said they have no interest in pursuing that (understandable).

I always wanted a dedicated binary anyway, so 1 hour later I got: https://github.com/emilburzo/pushbulleter (10 minutes vibe coding with Claude, 50 minutes reviewing code/small changes, adding CI and so on). And that's just one where I put in the effort of making it open source, as others might benefit, nevermind the many small scripts/tools that I needed just for myself.

So I share the author's sentiments, before I would have considered the "startup cost" too high in an ever busy day to even attempt it. Now after 80% of what I wanted was done for me, the fine tuning didn't feel like much effort.

Hasnep 21 hours ago [-]
Simon Willison's list has a few: https://tools.simonwillison.net/
sorcercode 20 hours ago [-]
nice! thanks for sharing this. as always, Simon seems to be multiple steps ahead in this game.
kayge 10 hours ago [-]
Yep! Nothing worth sharing/publishing from me, but quite a few mini projects that are specific to my position at a small non-tech company I work for. For example we send data to a client on a regular basis, and they send back an automated report with any data issues (missing fields, invalid entries, etc) in a human-unfriendly XML format. So I one-shotted a helper script to parse that data and append additional information from our environment to make it super easy for my coworkers to find and fix the data issues.
mvdwoord 19 hours ago [-]
Definitely... I just bought a new NAS, and after moving stuff over and downloading some new movies and series, "vibe coding" a handful of scripts that check completeness of episodes against some database, or the difference between the filesystem and what Plex recognized, is super helpful. I noticed one movie that was obviously compressed from 16:9 to 4:3, and two minutes later I had a script that can check my entire collection for PAR/DAR oddities and provides a way to correct them using ffmpeg.

These are all things I could do myself, but the trade-off typically is not worth it. I would spend too much time learning details and messing about getting it to work smoothly. Now it is just a prompt or two away.
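The detection half of that is only a few lines. A sketch, assuming ffprobe is on PATH and a hypothetical library location; a non-square SAR is just one heuristic for spotting aspect-ratio oddities:

    import subprocess
    from pathlib import Path

    # Sketch: flag files whose pixel aspect ratio (SAR) isn't square,
    # i.e. candidates for the squeezed-aspect problem described above.
    for f in Path("/media/movies").rglob("*.mkv"):
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "stream=sample_aspect_ratio,display_aspect_ratio",
             "-of", "csv=p=0", str(f)],
            capture_output=True, text=True,
        ).stdout.strip()
        sar, _, dar = out.partition(",")
        if sar not in ("1:1", "N/A", ""):
            print(f, "SAR", sar, "DAR", dar)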

pjmlp 19 hours ago [-]
I see it differently, no need to assign learning tasks to juniors that can now be outsourced to the computer instead.

This is currently the vibe in consulting: possible ways to reduce headcount, pun intended.

AceJohnny2 21 hours ago [-]
I have a bunch at work, yes. Can't publish them.

Just an hour ago I "made" one in 2 minutes to iterate through some files, extract metadata, and convert to CSV.

I'm convinced that hypothesis is true. The activation energy (with a subscription to one of the big 3, in the current pre-enshittification phase) is approximately 0.

Edit: I also wouldn't even want to publish these one-off, AI-generated scripts, because for one they're for specific niches, and for two they're AI generated so, even though they fulfilled their purpose, I don't really stand behind them.

mrguyorama 8 hours ago [-]
>Just an hour ago I "made" one in 2 minutes to iterate through some files, extract metadata, and convert to CSV.

Okay, but lots of us have been crapping out one-off Python scripts for processing things for decades. It's literally one of the main ways people learned Python in the 2000s.

What "activation energy" was there before? Open a text file, write a couple lines, run.

Sometimes I do it just from the interactive shell!

Like, it's not even worth it to prompt an AI for these things, because it's quicker to just do it.

A significant amount of my workflow right now is a python script that takes a CSV, pumps it into a JSON document, and hits a couple endpoints with it, and graphs some stats.

All the non-specific stuff the AI could possibly help with are single lines or function calls.

The hardest part was teasing out Python's awful semantics around some typing stuff. Why Python is unwilling to parse an int out of "2.7" I don't know, but I wouldn't even have known to prompt an AI for that requirement, so no way it could have gotten that right.

It's like ten minutes to build a tool like this even without AI. Why weren't you doing it before? Most scientists I know build these kinds of microscripts all the time.
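For the record, the int("2.7") quirk: int() only accepts integer string literals, so the usual workaround is to parse as float first and truncate:

    # int() parses only integer string literals; "2.7" is rejected.
    try:
        int("2.7")
    except ValueError as e:
        print(e)              # invalid literal for int() with base 10: '2.7'

    # Workaround: parse as float, then truncate toward zero.
    print(int(float("2.7")))  # 2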

thorncorona 1 hours ago [-]
Because even though I could learn some random library, I don't really care to. I can do the architecture; I don't care to spend half an hour deeply understanding how the arguments to some API work.

Example: I rebuilt my homelab in a weekend last week with Claude.

I set up Terraform/Ansible/Docker for everything, and this was possible because I let Claude handle all the arguments/details. I never used to bother because I thought it was too tedious.

dotancohen 20 hours ago [-]
Who's the third? I'm assuming OpenAI and Anthropic are 1 and 2.
AceJohnny2 20 hours ago [-]
Yeah, Anthropic & OpenAI for two, the third being Google. I hear Gemini's gotten quite good.
duckerduck 18 hours ago [-]
I've created a custom "new tab" page that I have been enjoying. 95% vibes. I wrote about it here:

https://janschutte.com/posts/program-for-one.html

shepherdjerred 18 hours ago [-]
I spend an embarrassing amount of time on my homelab with Cursor

https://github.com/shepherdjerred/homelab

INTPenis 19 hours ago [-]
Absolutely. I can come home from a long day of video meetings, where normally I'd just want to wind down. But instead I spend some time instructing an AI how to make a quality of life improvement for myself.
nurettin 20 hours ago [-]
For me, Claude churns out like 10-15 Python scripts a day. Some of these could be called utilities. It helps with debugging program outputs, quick statistical calculations, stuff I would use Excel for. Yesterday it noticed a discrepancy that led to us finding a bug.

So yes, there is a ton, but why bother publishing and maintaining them now that anyone can produce them? Your project is not special or worthwhile anymore.

crimsoneer 17 hours ago [-]
Are people not vibe coding lots of tiny things? I certainly am.

Last weekend I had a free hour and built two things while sat in a cafe:

- https://yourpolice.events, that creates a nice automated ICS feed for upcoming events from your local policing team.

- https://github.com/AndreasThinks/obsidian-timed-posts, an Obsidian plugin for "timed posts" (finish it in X minutes or it auto-deletes itself)

ThrowawayTestr 21 hours ago [-]
I've used chatgpt to make custom userscripts for neopets but I've never published them.
citizenpaul 20 hours ago [-]
Cross-compiling is not unique to Go. It does make it pretty easy, though.
consp 19 hours ago [-]
Why cross-compile if it's made specifically for macOS?
procaryote 19 hours ago [-]
Why compile it when it could be a bash script?
dotancohen 18 hours ago [-]
Why a bash script when it could have been a one-liner?
larodi 16 hours ago [-]
> Two years ago, I wouldn’t have bothered with the rewrite, let alone creating the script in the first place. The friction was too high. Now, small utility scripts like this are almost free to build.

Adding to the theory that soon we're gonna prefer to write rather than download ready-made code, because the friction is super low.

usrusr 14 hours ago [-]
Arguably a one-off written by the cloud would still be downloaded to the place where it eventually runs.
diodoe 16 hours ago [-]
Interesting. Is there a way to adapt this for Linux or Windows? Many users, not just Mac users, face issues with USB-C cables. Practical cross-platform tools could be very helpful.
razakel 11 hours ago [-]
Linux: lsusb -tv

Windows: There's an example in the WDK here: https://github.com/Microsoft/Windows-driver-samples/tree/mai...

vel0city 3 hours ago [-]
A useful tool for debugging USB devices on Windows: https://learn.microsoft.com/en-us/windows-hardware/drivers/d...
SomeoneOnTheWeb 19 hours ago [-]
Side question: what font are you using in your screenshots? I find it really nice looking.
spondyl 18 hours ago [-]
Commit Mono apparently: https://kau.sh/blog/commit-mono/
coldtea 17 hours ago [-]
> Go also has the unique ability to compile a cross-platform binary, which I can run on any machine.

Huh? Is this true? I know Go makes cross-compiling trivial - I've tried it in the past, it's totally painless - but is it also able to make a "cross platform binary" (singular)?

How would that work? Some kind of magic bytes combined with a wrapper file with binaries for multiple architectures?

heinrich5991 16 hours ago [-]
Not what OP meant, but there's a project doing what you ask for: https://justine.lol/cosmopolitan/. It's quite interesting. :)
basepurpose 18 hours ago [-]
I don't understand why competent people need to mention that they vibe coded something.
sand500 18 hours ago [-]
It's a disclaimer that they have no idea what their code does.
jasode 18 hours ago [-]
It's just because vibe coding is still "new" and various people have mixed results with it. This means that anecdotes today of either success or failure still carry some "signal".

It will take some time (maybe more than a decade) for vibe coding to be "old" and consistently correct enough where it's no longer mentioned.

Same thing happened 30 years ago with "The Information Superhighway" or "the internet". Back then, people really did say things like, "I got tomorrow's weather forecast of rain from the internet."

Why would they even need to mention the "internet" at all?!? Because it was the new thing back then and the speaker was making a point that they didn't get the weather info from the newspaper or tv news. It took some time for everybody to just say, "it's going to rain tomorrow" with no mentions of internet or smartphones.

DonHopkins 54 minutes ago [-]
It's like how in space, they just call the "space bar" the "bar".
stefanfisk 18 hours ago [-]
I wouldn’t be surprised if it’s actually a plus in the eyes of possible new employers these days.
basepurpose 18 hours ago [-]
Vibe coding, in my understanding, is losing/confusing the mental model of your codebase: you don't know what is what and what is where. I haven't found a term for "competently coding with AI as the interface".
stefanfisk 17 hours ago [-]
I agree. But management types bedazzled by AI probably see these kids as the future leaders of our profession.
sharkjacobs 18 hours ago [-]
I mean, they seem to address that pretty directly in the post

> Two years ago, I wouldn’t have bothered with the rewrite, let alone creating the script in the first place. The friction was too high. Now, small utility scripts like this are almost free to build.

> That’s the real story. Not the script, but how AI changes the calculus of what’s worth our time.

raincole 18 hours ago [-]
"My static blog templating system is based on programming language X" is the stereotypical HN post. In theory the choice of programming language doesn't matter. But HNers like to mention it in the title anyway.
dist-epoch 17 hours ago [-]
For the same reason competent people need to mention that X utility was (re)written in Rust.
timeon 16 hours ago [-]
I would guess the reason there is the opposite: like code that even a newcomer can safely edit.

But in general you are right. The article was for developers, so mentioning the tool/language/etc. is relevant.

cratermoon 3 hours ago [-]
"yes, vibe coded. Shamelessly, I might add"

I wouldn't trust this as source code until after a careful audit. No way I'm going to trust a vibe-coded executable.

pmlnr 19 hours ago [-]
Imagine if we printed the capabilities on the cables, like we used to.
chedabob 18 hours ago [-]
USB4 is supposed to have proper labels: https://www.pcworld.com/article/2602229/better-usb-labels-ar...

I don't know if that necessarily helps though, because I've seen USB3 cables that seemingly have the bandwidth and power capabilities, but won't do video.

jeroenhd 18 hours ago [-]
Capabilities are printed on the side of ethernet cables and the text printed on the cable rarely seems related to the actual capabilities of the ethernet plug. Some cat5e cables are rated for 1000mbps but will happily run 5000mbps or 2500mbps (because those standards came after the text on the cable was last updated), other "cat6" cables are scams and struggle achieving gigabit speeds without packet loss.

Plus, it doesn't really matter if you put "e-marker 100W USB3 2x2 20gbps" on a cable when half those features depend on compatibility from both sides of the plug (notably, supposedly high-end devices not supporting 2x2 mode or DP Alt mode or charging/drawing more than 60W of power).

Dylan16807 17 hours ago [-]
USB cables push the boundaries of signal integrity hard enough that unless it's a 1 foot passive cable you're not really going to get any surprise speed boosts.

And when they upped the max voltage they didn't do it for preexisting cables, no matter what the design was.

> those features depend on compatibility from both sides of the plug

That's easy to understand. Cable supports (or doesn't support) device, it can't give new abilities to the device. It doesn't make labeling less valuable.

mrheosuper 17 hours ago [-]
We used to what? Back in the day there were countless cables with no printing. Sometimes the only way to know if they were 3.0 or not was checking whether they had a blue connector.
threatripper 19 hours ago [-]
That would only confuse potential buyers. You have to design everyday products for non-technical people.
withinboredom 19 hours ago [-]
Not only that, it doesn’t stop unscrupulous manufacturers from just printing whatever they want.
Dylan16807 17 hours ago [-]
How could a max speed rating possibly be worse than a blank plug end?
thefz 20 hours ago [-]
Vibe coded, no thanks.
self_awareness 19 hours ago [-]
Vibe coding. Producing code without considering how we should approach the problem. Without thinking about where exactly the problem is. This is like Electron, all over again.

Of course I don't have any problem with the author writing the tool, because everyone should write what the heck they want and how they want it. But seeing it get popular tells me that people have no idea what's going on.

basepurpose 18 hours ago [-]
If the author knows what they're doing and at least understands the model of the code, I don't understand the reason for mentioning that it was vibe coded. Maybe declaring something is vibe coded removes part of the responsibility nowadays?
self_awareness 18 hours ago [-]
Someone once told me that their mistake wasn’t theirs, but rather it was ChatGPT being wrong.

I think you have a good point about why people say it was vibe coded.

It might also be because they want to join the trend -- without mentioning vibe coding, I don't think this tool would ever reach #1 on Hacker News.

drcongo 17 hours ago [-]
HN guidelines say one shouldn't question whether another commenter has read TFA, so I won't do that. But TFA explains exactly why it was vibe coded, and exactly why they're mentioning that it was vibe coded, which is that that was the central point of TFA.
mrheosuper 17 hours ago [-]
And why should they care what's going on?

Do you care about the binary code inside your application, or what exactly happens, at the silicon level, when you write printf("Hello World")?

self_awareness 12 hours ago [-]
Yes.

I verify dynamic linking, ensure no superfluous dylibs are required. I verify ABI requirements and ensure a specific version of glibc is needed to run the executable. I double-check if the functions I care about are inlined. I consider if I use stable public or unstable private API.

But I don't mean that the author doesn't know what's going on in his snippet of code. I'm sure he knows what's going on there.

I mean that the upvoters boosting vibe coding have no idea what's going on. People who upvote this are the reason for the global software quality decline in the near future.

mrheosuper 8 hours ago [-]
All your stuff is still pretty high-level compared to the bare metal inside the CPU. Do you know which register the compiler decided to use to store this variable, or whether the CPU will take this execution branch or not?

It's all abstraction; we all need to not know some low-level layer to do our job, so please stop gatekeeping it.

self_awareness 7 hours ago [-]
What's your point? That we shouldn't care about anything at all because there is 1 thing we truly shouldn't care about?

That we shouldn't care about spending $1 on a sandwich, and therefore managing a home budget is pointless?

rmunn 21 hours ago [-]
Please update the title to mention that this is macOS-only; I got excited to try this out, but I only have laptops running Linux and Windows.
sorcercode 20 hours ago [-]
yeah sorry about that. I don't have access to a Linux/Windows machine.

if I got a hold of the output and commands run, I would gladly modify it.

jasonjayr 20 hours ago [-]
> lsusb -v

On Linux that produces a lot of info similar to the macOS screenshots, but with values and labels specific to the Linux USB stack.

I wonder if AI could map the linux lsusb output to a form your tool can use...

lanyard-textile 20 hours ago [-]
Is it really vibe coding if you’re testing it on the target machine? ;)
adastra22 19 hours ago [-]
Yes, I think? “Vibe coding” is more about whether you are reading/reviewing the generated code, or just going with it, afaik.
kenperkins 20 hours ago [-]
FWIW, it would take 10 minutes to download a Linux Docker image and build the Go binary to test. The harder part is getting the information from a different API on Linux.
RKearney 20 hours ago [-]
This post is 12 minutes old. Have you finished yet?
skissane 17 hours ago [-]
A Linux Docker image probably doesn't have any USB devices exposed to it. Well, it depends on exactly how you run it, but e.g. if you use Docker Desktop for Mac, its embedded Linux VM doesn't have any USB passthrough support. This is the kind of thing where a physical Linux host (laptop/desktop/NUC/RPi/etc.) is much more straightforward than running Linux in a VM (or as a K8s pod in a datacenter somewhere).
febusravenga 19 hours ago [-]
... and orders of magnitude more time to properly pass USB devices through to some arcane VM not in your control.
tomhow 18 hours ago [-]
We've updated the title now, thanks.
swyx 19 hours ago [-]
what do you mean, all developers only use macs!

(/s)

jedbrooke 2 hours ago [-]
I feel like we kind of got monkey's-paw'ed on USB-C. I remember during the 2000s-2010s people were drowning in a sea of disparate and incompatible connectors for video, audio, data, power, etc., and were longing for "One Port To Rule Them All" that could do everything in one cable. We kind of got that with USB-C, except now you see a USB-C cable/port and you have no idea if it supports data only or data + charging, what speeds of data/charging, or whether it supports video (maybe it does, maybe it doesn't). At least it can plug in both ways… most of the time.
op00to 40 minutes ago [-]
I bought the coolest, fattest USB-C cables, and I failed to read the description closely enough to notice they only support USB 2 speeds! They work fine for the specific use I have for them, but I wish I could use ‘em for everything!
bluedino 2 hours ago [-]
I was just thinking the other day: what if the connectors had been USB-C from the start?

No Type-A, no Type-B, no Mini, no Micro...

op00to 39 minutes ago [-]
insert futuristic city picture with flying cars here
alanh 55 minutes ago [-]
And?
edarchis 19 hours ago [-]
A similar tool, open source and portable across Linux/Mac/Windows, is Cyme. It works wonderfully well.

https://github.com/tuna-f1sh/cyme

Slartie 18 hours ago [-]
Thanks for mentioning this, I spontaneously love it!
self_awareness 18 hours ago [-]
Guys, please, don't upvote this. If this topic beats the "Physics Nobel Prize 2025" one, I will lose my faith in HN.