Hacker News | depierre's comments

One of my personal favorites. I've used it for parsing SAP's RPC network protocol, reverse-engineering Garmin apps [0], and more recently in a CTF challenge that involved an unknown file format, among other things. The syntax is surprisingly quick to pick up.

The serialization branch for Python [1] (I haven't tried the Java one) has generally done the job for me, though I've had to patch a few edge cases.

One feature I've often wished for is access to physical offsets within the file being parsed (e.g. being able to tell that a field foo you just parsed starts at offset 0x100 from the beginning of the file). As far as I know, you only get offsets relative to the parent structure.

0: https://github.com/anvilsecure/garmin-ciq-app-research/blob/...

1: https://doc.kaitai.io/serialization.html
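The absolute-offset bookkeeping described above can be emulated outside Kaitai. Here is a minimal plain-Python sketch (the field names and sample format are invented for illustration, not from any real spec) that records `tell()` before each field read:

```python
import io
import struct

# Sketch of the absolute-offset tracking described above: record the
# stream position before parsing each field, keyed by field name.
def parse_with_offsets(stream):
    offsets = {}

    def read(name, fmt):
        offsets[name] = stream.tell()  # absolute offset in the file
        size = struct.calcsize(fmt)
        return struct.unpack(fmt, stream.read(size))[0]

    header = {
        "magic":   read("magic", "<I"),
        "version": read("version", "<H"),
        "count":   read("count", "<H"),
    }
    return header, offsets

data = io.BytesIO(struct.pack("<IHH", 0xC0DEBABE, 3, 7))
fields, offsets = parse_with_offsets(data)
print(offsets)  # {'magic': 0, 'version': 4, 'count': 6}
```

In a Kaitai-generated parser you would instead have to sum each structure's relative offset up the parent chain to reconstruct the same information.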


SAP can mean so many things that it's easy to get lost in the weeds, and I'm just talking about getting familiar with their landscape... While working on that post, I found new vulnerabilities that SAP is now addressing.

I'll be honest, I've never been on the other side dealing with red tape. It'd probably drive me mad. But from the researcher/consultant side, it's definitely gotten easier to report vulnerabilities. Vendors now have security contacts, coordinated disclosure policies, and even bug bounty programs. Not all vendors, of course. But compared to 10 years ago, it's night and day.


From personal experience, CERT advisories can help cut a lot of red tape. A lot of the wishful thinking and inertia evaporate once the public disclosure goes out.

That is a big part of why there’s so much support for the disclosures. People like me and GP see how little progress gets made without the “Press”.


It's really about striking a balance: giving vendors a fair chance to patch, while also not leaving users in the dark indefinitely. That's also why the 90-day disclosure policy has become common in the industry (e.g., Google's Project Zero). I've had cases where I tried reaching out via email, LinkedIn, Twitter, you name it, and got radio silence for months and months. Then, when you make the difficult decision to go public, the vendor finally reacts... That sudden urgency only shows up when there's a bit of spotlight.


Your remark seems to match what I've observed during the reverse engineering part of the project, with magic constants like `0xc0debabe` [0] or opcodes like `canhazplz` [1] that you would expect more from a student CS project, for instance.

[0]: https://github.com/anvilsecure/garmin-ciq-app-research/blob/...
[1]: https://github.com/anvilsecure/garmin-ciq-app-research/blob/...
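As an aside, hunting for such magic constants in a firmware dump is easy to script. A small sketch (the blob below is synthetic; only the constant comes from the comment above):

```python
import struct

MAGIC = 0xC0DEBABE  # constant mentioned above

def find_magic(blob, magic, endian="<"):
    """Return every byte offset where the 32-bit magic appears."""
    needle = struct.pack(endian + "I", magic)
    hits, start = [], 0
    while (idx := blob.find(needle, start)) != -1:
        hits.append(idx)
        start = idx + 1
    return hits

blob = b"\x00" * 16 + struct.pack("<I", MAGIC) + b"\xff" * 8
print(find_magic(blob, MAGIC))  # [16]
```

Trying both endiannesses is usually worthwhile when the target architecture is unknown.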


Yeah, exactly. THAI_SPICY_HOT was just one of many signals that the MonkeyC project could very well have started out as an intern project or similar.

The list of languages the documentation claims MonkeyC takes inspiration from also suggests a certain type of programmer background:

> C, Java™, JavaScript, Python™, Lua, Ruby, and PHP all influenced the design for Monkey C

(source: https://developer.garmin.com/connect-iq/monkey-c/)

If I asked an embedded hardware expert to design a novel programming language for my highly resource-constrained wearables platform, I would be very surprised if these were the language touch points they used as their references in the design brief.


Author here. Fair question!

They get access to the internet via the Garmin Connect companion app. But if you're asking whether they can be exploited from the internet, no, that's not what we showed.

The vulnerabilities we've disclosed require a malicious app to be installed (e.g. from the CIQ app store) so let's not cry wolf.

What I think this project highlights, and what we should take away, is the current level of security of Garmin devices.

GarminOS deploys none of the security mitigations one would expect in modern devices (let's exclude crappy IoT devices flooding the market). No stack canaries, no W^X, etc. It does not implement isolation between user-supplied code and the rest of the OS either. And their C code base does not appear to receive much scrutiny in terms of security review.

It would be much easier to exploit the watch (e.g. sending a malicious message to the user's phone, which forwards it to the watch to display the notification) than to exploit the user's smartphone. And that could be performed from the internet.


Indeed there have been software defects that caused Garmin watches to crash when displaying certain text messages.

https://forums.garmin.com/outdoor-recreation/outdoor-recreat...


Do you have thoughts on the NFC and particularly Garmin Pay features?

I wonder if these are secured differently or merely obscured behind the encrypted firmware on newer models.


It's not something I've encountered yet so I don't have any insight to share. I would be surprised if they were secured differently but I'm purely speculating here.


Author here.

They did start encrypting the firmware of their latest devices. I noted that the firmware images for Forerunner 55, 945 and 955 were encrypted. Most likely others are as well.

In the live demo I did at Hack in the Box a couple of days ago (slides available [0]), I showed how to exploit one of the vulnerabilities to read the memory of the Forerunner 55, making it possible to dump the firmware unencrypted. The CIQ demo app is also on our GitHub repo [1].

[0]: https://conference.hitb.org/hitbsecconf2023ams/materials/D2T...
[1]: https://github.com/anvilsecure/garmin-ciq-app-research/tree/...


Author here. Thanks!

I am surprised as well, especially considering that their compiler does little to no optimization. For instance, it won't remove dead code or unused variables. These seem like low-hanging fruit that could save memory and cycles on low-power devices.
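For reference, the dead-code elimination mentioned above is a small backward pass over the instruction stream. A toy sketch over an invented three-address IR (nothing here reflects MonkeyC's actual internals):

```python
# Toy dead-code elimination over an invented three-address IR.
# Each instruction is (dest, op, operands); a None dest marks a side
# effect (e.g. a return) that must always be kept.
def eliminate_dead_code(instrs):
    live = set()
    kept = []
    # Walk backwards: an assignment is live only if its dest is read later.
    for dest, op, operands in reversed(instrs):
        if dest is None or dest in live:
            kept.append((dest, op, operands))
            live.discard(dest)
            live.update(o for o in operands if isinstance(o, str))
    kept.reverse()
    return kept

program = [
    ("a", "const", (1,)),
    ("b", "const", (2,)),      # never read again: dead store
    ("c", "add",   ("a", "a")),
    (None, "ret",  ("c",)),
]
print(eliminate_dead_code(program))  # keeps a, c, ret; drops b
```

A single linear pass like this is cheap enough that even a resource-constrained toolchain has little excuse to skip it.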


