But remember: even responsibly reporting a security vulnerability could end with legal action being taken against you.
- Source code of the Instagram website
- SSL certificates and private keys for Instagram
- Keys used to sign authentication cookies
- Personal details of Instagram users and employees
- Email server credentials
- Keys for over half a dozen other critical functions
- A hard-coded Ruby secret token in the Sensu-Admin web app running on the server
- A version of Ruby (3.x) on the host that was susceptible to code execution via the Ruby session cookie
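Why does a hard-coded secret token rank among the worst of these findings? Web frameworks use that secret to sign session cookies, so anyone who learns it can mint a cookie the server will accept as genuine. Here is a minimal sketch of that failure mode in Python, using a simplified HMAC-signed cookie scheme; the secret value and the `sign_cookie`/`verify_cookie` helpers are illustrative inventions, not Sensu-Admin's actual code:

```python
import base64
import hashlib
import hmac

# Hypothetical hard-coded secret, standing in for the token
# found in the Sensu-Admin web app.
SECRET_TOKEN = b"hard-coded-secret"

def sign_cookie(payload: bytes, secret: bytes) -> str:
    """Sign a session payload the way many frameworks do: payload + HMAC."""
    sig = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode() + "--" + sig

def verify_cookie(cookie: str, secret: bytes):
    """Server-side check: recompute the HMAC and compare in constant time."""
    data, _, sig = cookie.rpartition("--")
    payload = base64.b64decode(data)
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return payload if hmac.compare_digest(sig, expected) else None

# An attacker who knows the secret can forge an "admin" session
# that passes the server's own signature check.
forged = sign_cookie(b'{"user":"admin"}', SECRET_TOKEN)
assert verify_cookie(forged, SECRET_TOKEN) == b'{"user":"admin"}'
```

In Ruby apps of that era, the signed cookie payload was additionally deserialized on the server, which is what turned a leaked signing token into the session-cookie code execution described above.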
Exposed EVERYTHING including Your Selfies
- Instagram's source code
- SSL certificates and private keys (including for instagram.com and *.instagram.com)
- API keys that are used for interacting with other services
- Images uploaded by Instagram users
- Static content from the instagram.com website
- Email server credentials
- iOS/Android app signing keys
- Other sensitive data
"To say that I had gained access to basically all of Instagram's secret key material would probably be a fair statement," Weinberg wrote in his blog. "With the keys I obtained, I could now easily impersonate Instagram, or any valid user or staff member. While out of scope, I would have easily been able to gain full access to any user’s account, [personal] pictures and data."
Responsible Disclosure, but Facebook Threatens Lawsuit
"Condoning researchers going well above and beyond what is necessary to find and fix critical issues would create a precedent that could be used by those aiming to violate the privacy of our users, and such behavior by legitimate security researchers puts the future of paid bug bounties at risk," Stamos added.
Here's the full statement by Facebook:
We are strong advocates of the security researcher community and have built positive relationships with thousands of people through our bug bounty program. These interactions must include trust, however, and that includes reporting the details of bugs that are found and not using them to access private information in an unauthorized manner. In this case, the researcher intentionally withheld bugs and information from our team and went far beyond the guidelines of our program to pull private, non-user data from internal systems.
We paid him for his initial bug report based on the quality, even though he was not the first to report it, but we didn't pay for the subsequent information that he had withheld. At no point did we say he could not publish his findings — we asked that he refrain from disclosing the non-public information he accessed in violation of our program guidelines. We remain firmly committed to paying for high quality research and helping the community learn from researchers' hard work.