I'd like to point out that the things he says he's doing instead of marketing are, in fact, marketing. It's "guerilla" marketing, and it's being paid for with the writer's time. Nothing wrong with that, just don't confuse "marketing" with "advertising".
The author does not say how much maintaining the service costs, or what the long-term plan is. As others have already pointed out, a similar service existed before and ended for a simple reason – there was no point in maintaining it at a constant loss with no clear revenue plan.
Yes, but what's your revenue? Please consider going transparent (e.g. http://www.transparentstartups.com/) and doing an interview with one of my favorite websites, indiehackers.com
Providing revenue and financial metrics really gets eyeballs, as this info is rarely available for early-stage companies. Balsamiq used this very well: early on they released monthly financial status blog posts, almost all of which reached the front page/top of HN. So take it as another guerilla marketing channel.
I'm on mobile currently and apparently cannot look at reviews for Chrome extensions. Cookie Inspector is open source[0] according to the extension page, so you could try finding any adware-related code yourself, couldn't you?
Chrome does nothing to guarantee that an extension is built from the source that it claims.
You can verify that source, enable external sources in Chrome ('developer mode' or something similar), and then install it - but that says nothing about what's in the Chrome store.
I think a more useful way to think about it is "Know your users, and your channel." There are some user populations - and developers are usually one of them - that are virulently anti-advertising. Any paid channel usually earns instant distrust from them. Putting money into marketing spending can have a negative ROI for them, because it has a signaling effect on the brand that says "Our product quality isn't good enough for us to get users without paying for them."
Then there are other user populations - most e-commerce is like this - where there is no such signaling effect, and they are happy to check out new products, regardless of how they hear about the product. For these markets, it's silly to ignore paid channels; you're just leaving money on the table.
I think Apple's paid marketing efforts are a counterexample to your argument. Ineffective advertising is ineffective; as a tautology, that statement is pretty unremarkable.
I first found out about ipinfo from a Google search looking to solve this problem. Haven't pulled the trigger on using it, but it's been on the back of my mind for awhile now.
For me, their marketing happened to be their placement on Google. I don't really know how you'd pursue paid channels on this one (outside of SEM and SEO).
It's a solution to a known problem. Unlike, say, the Apple Watch, which first has to convince you that you have a problem... (This coming from a guy who's very interested in buying a watch as I train for a marathon.)
For developer tools? Apple certainly spends a ton on brand advertising for consumers, but most developers I know develop for Apple because they want to reach Apple consumers. (I spent a year doing smartwatch apps before giving up on the platform...the only reason I was developing for Apple Watch is because that's where the users were.)
As a single example that immediately came to mind: WWDC. If that isn't a massive marketing channel directly targeted at developers I don't know what is.
They're anti-advertising on paper. Are you telling me no one's ever run a successful ad campaign targeted at developers? I don't believe that for a minute.
I wouldn't go so far as "never", but historically, developer tools with lasting growth tend to spread via word of mouth & community recommendations rather than ad campaigns. See eg. Ruby/Rails, Python and supporting ecosystem, JQuery, Node.js, Express, Postgres, Vue.js, GitHub before they took VC, Parse before they were acquired by Facebook, etc. Even Google - they had distribution deals starting early on, but they were very reluctant to spend money on advertising up until about 2010 (I remember watching the first Google SuperBowl commercials at TGIF, and hearing a lot of questions about why we were spending money on advertising and what it meant for organic growth).
When ad campaigns are run, they usually work by targeting programmers' bosses, encouraging the organization to purchase & standardize on one solution and then forcing developers to adopt it via management fiat. That's the route taken by Java, MongoDB, Oracle, .NET, many of the companies in the Hadoop ecosystem, etc. It certainly works - these are big companies - but it requires a consultative enterprise sales team that can work to get the solution deployed across the whole enterprise. The author obviously doesn't have the resources for that.
The middle ground, where you provide a developer tool with $100-200/year CLTV and hope to get it distributed via paid advertising, is a very difficult place to occupy. That's the area occupied by FogBugz and RethinkDB and Sandstorm and many analytics providers. Most of them go the community-building route, and it's still very difficult. Only company I can think of that's thrived here is GitHub, and they went the word-of-mouth, community-building route first. Companies like Jetbrains, Atlassian, Rational, etc. thrive as well, but they've got large enterprise sales teams that standardize a whole organization on their products.
Sold to the boss, not the developer. The best devs I know actually get very suspicious when they see a startup full of Aerons, because it means that management is more focused on appearances than frugality. They look for chairs that are comfy but cheap, like something you'd get used off Craigslist, or if you do have Aerons, they'd like to hear that you picked them up from a bankruptcy auction of another startup.
That was just an example. Besides, I'm not sure what you can infer about developers from the beliefs of the "best devs", considering they're only a fraction of the whole.
The point is that developers don't just buy software. They're marketed stuff constantly, like everybody else. To suggest they're immune to it seems a little naive to me.
Not if they had no money to spend on marketing in the first place. Saying no money was spent on marketing advertises strategies that might work for someone with little capital who is trying to bootstrap a company.
That's a fair and charitable interpretation. If we relax the "no money" assumption a little, one might then question if it's the wisest capital allocation (why not spend, say, $200 on marketing and see if that helps?).
> I built the API in a few hours, posted the answer, and forgot about it — until a few months later I got an email saying my server usage was off the charts. I’d been getting millions of requests per day.
that they basically just set it up as a side project and answered a couple of questions on SO about it, only to be confronted with its explosive growth after a few months.
One could then interpret the author in that situation as thinking "don't need marketing, who's gonna use this little thing I made anyway. [a couple months later] Oh shit, this blew up and I didn't even need to spend anything on marketing it besides plugging it on Stackoverflow occasionally".
The takeaway isn't that he could make more money by marketing directly to developers, but that sites like StackOverflow give him direct access to developers at the _very_ moment their pain point is most acute.
Someone searching for "IP API" on StackOverflow and having his site rank in the top result(s) is the same as searching for "dresses" on Google, but without having to pay for search placement and getting 100% targeted traffic.
They may just be specifying the amount of marketing.
If the headline simply reads "Taking $foo to 250M API req/day" and then it turns out that all of the traffic came from a $4MM ad spend, it's a lot less interesting to those without $4MM to spend.
We saw a 500,000% gain in sales through marketing, but to be fair we were small and had no idea what we were doing.
I was as skeptical as they come, but a good marketing team basically launched us from nobodies into LEO. Yeah we had a great core product but still... gotta know how to sell it.
The post title is misleading. The author has clearly invested substantial amounts of time, and some money (because the brochureware landing pages presumably aren't hosted at zero cost, they're on EC2). What they seem to be suggesting is they didn't buy ads. So it's a short and incomplete list of basic tips for identifying a need & creating a buzz via online communities; don't get hung up on the title.
Why? People hate ads. If your service grows with no spending on ads, it means you're getting traffic from word-of-mouth. And that indicates that you've built a great service that people are happy to promote for you. It just shows your product has virtue and it's naturally taken as a compliment.
I'd compare it to having a kid. If you have money you can probably buy your kid into a good school and hook them up with a good job when they graduate. Or you can set them up with the right foundation and watch them succeed on their own.
I did the same as the OP with a project of mine. I never spent a penny on ads and it's grown into a very popular charting/trading tool among cryptocurrency traders. I also take pride in the fact that I grew it organically.
What you're addressing here seems orthogonal to me. You can use "marketing" for a crappy product, and you can use it for a good product. All combinations exist.
What's puzzling is that you would deprive yourself of making your great product known to more people simply because you want to do it the "hard way."
Because marketing invariably reaches people who are uninterested and just leads to wasted bandwidth.
Also, it adds the risk of building something people don't want - users who only engaged because of the marketing can drown out the signal from people who were actually looking for what you built.
That echoes Peter Thiel in Zero to One, where he says Silicon Valley developer-entrepreneurs believe good products sell themselves with no sales and marketing effort.
>I'm not sure why people are proud to do things without spending money on marketing.
Being successful without marketing probably means you were successful due to word of mouth. It's hard to be successful due to word of mouth when your product sucks.
What it does is show where every asset on a web page is loaded from. It allows you to visualize how many different requests go into building just one web page. While it's gotten much better, the Houston Chronicle (https://chron.com) used to make about 500 individual requests to build its home page. It's down to about 125.
It's best to run it across two different monitors, with IP Request Mapper on one monitor and your "normal" browser window on another. Then enter any URL and watch the map populate based on geolocating every request made by the page.
But it's projects like ipinfo.io that make these other things possible. Standing on the shoulders of giants and all that...kudos to you, coderholic.
That's the same motivation that started me to build https://urlscan.io. I wanted an easy way for everyone to visualise the amount, size and destinations of the various HTTP requests that a single page-load triggers. Incidentally I also created a tool that is being used by a lot of Security / Phishing researchers. If you ever want some inspiration for additional IP / domain annotation sources, check it out. I should really do a "Show HN" soon ;)
This website contacted 35 IPs in 7 countries across 24 domains to perform 302 HTTP transactions. Of those, 51 were HTTPS (17%) and 35% were IPv6.
The main IP is 92.123.94.227, located in the European Union, and belongs to AKAMAI-ASN1.
In total, 4 MB of data was transferred, which is 9 MB uncompressed. It took 2.51 seconds to load this page. 16 cookies were set, and 42 messages were logged to the console.
Not sure if you are asking me or heipei. For my project, it shows the requests on a map, where the requests in dev tools aren't geolocated and mapped. Same basic principle, just a different view.
I'd call it more interesting than helpful (not that it's not helpful). It gives insight into how a web page is actually built -- while I'm sure most readers here understand it, that's not true for the general population.
It might make developers think twice about how they build sites if they could see how overly complex they get. As I mentioned, the Houston Chronicle site used to require about 4x the number of requests as it does now so someone did some optimization.
Happy user here. My GF came up to me and asked if I could somehow get country names for the IP addresses she had from her survey respondents. I Googled and found this neat little API. True, I could have downloaded the raw databases from elsewhere and worried about the SQL I'd need, and whether the data was recent or ancient or even correct. I decided that was overkill for my need, and just used this API in a throttled (1 req/s) mode and left it overnight. If I have this IP-to-location need again, I'd happily pay for this API.
- Database needs to be distributed to your servers
- Database can become out of date easily
- Database lookup requires going to local disk and having a relatively fast access path/cache for lookups
- In general, a local database requires a large amount of effort compared to just running a curl in your PHP code.
If you are actually going to use a database, the proper solution does not look like "put it on your webservers" anyway, it looks like "put it on a separate service with a fast caching layer" etc etc. So in other words, the proper solution to decouple yourself from a 3rd party API is to... build a 1st party API.
In other words, not a 20 minute job. For small shops, a quick curl during the page load is a 20 minute job.
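The "quick curl during the page load" approach is simple enough to sketch. Here's a minimal Python version with the HTTP fetch made injectable, so the logic can be shown without a live request; the `/json` URL shape mirrors how hosted geo-IP APIs like ipinfo.io are typically called, but treat the exact endpoint and fields as assumptions:

```python
import json
from urllib.request import urlopen

def geo_lookup(ip, fetch=None):
    # `fetch` is injectable so the function can be exercised without
    # network access; by default it performs a real HTTP GET.
    url = "https://ipinfo.io/%s/json" % ip
    if fetch is None:
        fetch = lambda u: urlopen(u, timeout=2).read()
    return json.loads(fetch(url))

# Canned response instead of a live call (illustrative values only):
fake = lambda url: b'{"ip": "8.8.8.8", "country": "US", "city": "Mountain View"}'
info = geo_lookup("8.8.8.8", fetch=fake)
```

For a small shop this, plus a short timeout and a sane fallback when the API is down, really is close to a 20 minute job.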
There are many ways to quickly get and use a GeoIP DB like MaxMind, most requiring way less than 20 minutes. This is just one example of something I started using recently:
docker run -p 8080:8080 -d fiorix/freegeoip
curl localhost:8080/json/1.2.3.4
The OP was "baffled" why anyone would ever use this API. Clearly, the API's success shows that sometimes this tax you highlight is well worth it. (That was my point as well.) Nobody is claiming the tax doesn't exist, just that it shouldn't be baffling that a rational actor would choose to pay the tax in exchange for the corresponding benefits.
(Not to mention with very minimal effort you can usually avoid the majority of the specific tax of latency you mention, by doing things like parallelizing the request with other work or doing it asynchronously to the user's interface.)
It is kinda a 20 minute job. On production, where I work, we just bake the maxmind DB into our app server images (it updates when we redeploy - which is a couple times a week) and use the maxmind client to read from that DB file.
In this case the geoIP database is like a 1MB CSV file, provided for free to the entire world by MaxMind (a major geolocation data provider).
You can give that file to many servers like nginx/apache out of the box; they will start adding a header with the country and the city of the client.
What this service is doing is effectively looking up the ip block in that csv file and returning the result as JSON.
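The "look up the IP block in a CSV and return JSON" idea can be sketched in a few lines. This is a toy Python version over a hypothetical two-row CSV in the MaxMind spirit (the column names and rows are made up, not the real schema); a real service would preprocess the file into a radix trie or sorted interval table rather than scan it linearly on every request:

```python
import csv
import io
import ipaddress
import json

# Hypothetical mini database; real CSVs from geo-IP providers
# have more columns and millions of rows.
DB_CSV = """\
network,country,city
1.0.0.0/24,AU,Sydney
8.8.8.0/24,US,Mountain View
"""

def lookup(ip):
    # Linear scan for the block containing `ip` - fine for a sketch,
    # far too slow for 250M requests/day.
    addr = ipaddress.ip_address(ip)
    for row in csv.DictReader(io.StringIO(DB_CSV)):
        if addr in ipaddress.ip_network(row["network"]):
            return json.dumps({"ip": ip, "country": row["country"],
                               "city": row["city"]})
    return json.dumps({"ip": ip, "error": "not found"})
```

The gap between this sketch and a production service - fast lookups, fresh data, uptime - is exactly the "tax" people in this thread are paying to avoid.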
Do you think free IP DB providers insert "fictitious entries" [1] to identify breach of TOS like this, similar to what happened between Google/Bing a few years ago [2]?
yeah, especially because "[they] built the API in a few hours".
Having hosers abuse your free geoip service after finding it as the first hit on Google is nice, but the data being provided can't just be "hacked together" :P.
Yeah, that would be totally hands off. But I believe you'd have to ensure that your requests didn't time out (3 seconds in Lambda), and with the 10ms response times in this example I can't see any issue here. If you're into Python, check out Chalice - it's being built as a "flask"-like interface on top of AWS Lambda. https://github.com/awslabs/chalice
Neat, congratulations! I know a few people that were active in that space and none of them managed to make it profitable and they all faded out again. It's important that services like these exist and even more important that they are viable businesses otherwise you are building on quicksand.
He could have saved up money and be living off of his savings while working full-time, and the amount of money that he is being paid by his customers might be less than the sum of his business and/or personal expenses. Not saying that such is the case here but just pointing out that it's possible.
That's interesting. It's almost exactly the same. I did a trademark search, and although "Podio" is registered as a word mark, the logo design is not registered. So IPInfo is probably in the clear, but they may want to consider a new logo.
Like this [1]? These custom results are usually limited to some countries. Like when you search for "color picker" [2] and "bubble level" [3] on mobile. Google is known for testing these features in their services in specific locations so I wouldn't be surprised if people from other countries cannot see them.
Strange, both google.com and google.de show an info box with my IP for the query "my ip" for me (from Germany). It does not work with "meine IP", but that is more to type anyway.
I've been running https://jsonip.com for years. Been serving millions of requests a day for most of that time. Doesn't really show up in searches well because it's just an API.
I run a few small services, nothing of this scale, but one thing to bear in mind is that it's easy to pay for a lot of little side projects if they have virtually no costs.
The one cited simply echoes back your IP. That's it. How cheaply could you do that and how many requests per second could you handle on one small VPS?
Example, I recently ran https://www.tactical2017.com/ which is a tactical voting website for the UK general election. The cost for serving that whole website to 2.6 million people over 5 weeks, and 650k people in the last day and a half, was $20.
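To make the "how cheaply could you do that" point concrete, here's a minimal WSGI sketch of a pure echo-your-IP service (the IP addresses are documentation examples, not real traffic). Per request it does essentially no work, which is why a single small VPS can absorb a surprising request rate:

```python
import json
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    # Echo the caller's socket address as JSON - the entire "product".
    body = json.dumps({"ip": environ.get("REMOTE_ADDR", "")}).encode()
    start_response("200 OK", [("Content-Type", "application/json"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Exercise the handler directly, without running a real server:
environ = {}
setup_testing_defaults(environ)
environ["REMOTE_ADDR"] = "203.0.113.7"
result = app(environ, lambda status, headers: None)
```

In practice you'd put this behind nginx or run it with `wsgiref.simple_server`, but the per-request cost stays near zero either way.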
I use Linode. Been a loyal customer for years. Basic service is $40 a month, with $10 for automated backups. It fits within the bandwidth constraints. I've also spent a lot of time optimizing the hell out of the server.
I'm employing a similar strategy for my library https://github.com/joelgriffith/navalia as I couldn't find any solution to manage headless chrome (plus the API for that protocol is pretty heavy).
Building for what folks want, even developers, is so obvious that I think we often forget about it. It's also not as glamorous as self driving cars or rockets, so gets discredited easily.
Kudos to you guys for building this. There is always a lot of scepticism from people on "why would anyone pay for this". The reality is that not everyone has the time or resources to build their own kit. There are literally 1000s of businesses on the internet that are in the business of selling "time" - timesavers that remove the risk of maintenance and ongoing support.
Keep improving this and with the rise of web personalization, the demand will continue to grow.
Congrats. I'm not sure, but ipinfo could be very interesting to startups and programmers. So a good idea could be writing attractive articles and posting them on HN, Reddit programming, and some other subreddits. That would bring more customers with zero marketing.
This has worked well for me, too. I saw an influx of "How to offer a time-based trial version on Android" on SO and developed a trial SDK as an answer: https://www.trialy.io/
Given the fast lookup time, it would be useful if you could provide a JS API for synchronous loading.
Essentially, a blocking script in the DOM - <script src="...api.js" /> - that prepopulates the window object. With clever error handling, this could improve perceived performance significantly.
A few questions:
1. What differentiates you from ip-api.com and other providers?
2. Do you use MaxMind?
3. Is there an option for no-throttling? 100s of simultaneous requests?
I aggregate multiple IP databases for my SaaS (https://www.geoscreenshot.com) and I need highly performant / reliable IP look ups.
> Our servers use latency based DNS routing to serve over 90% of all requests handled in less than 10ms.
What exactly does that actually mean though?
Does it mean that processing time at your server is 10ms, or 10ms to time to first byte, or something else?
Giving it a quick test, I generally get the actual JSON result in around 400ms. The lowest I got was 200ms, the highest around 1000ms. It didn't seem to make any difference if I used the HTTP endpoint instead of the HTTPS one.
I see it's still a thing. Back in high school, some ten+ years ago, I coded up an 'ip2country' website. Not sure why, there were dozens of those. I guess I had a free domain and a lot of time on my hands. I put some Google AdSense on it and let it go. I checked my AdSense account some six months later and found out I was cashing $20/month. Easiest money I've ever made.
Well, currently my location is basically totally wrong.
Of the services listed on https://www.iplocation.net/ I've only seen one that gets my location 100% correct (the right village); all the others are 200 or more km away from my real location.
It just depends on the database they are using. Anyway, nowadays it's getting harder and harder to detect accurate location from the IP address (many users are on 4G or behind a proxy).
Yeah, and features like Data Saver on mobile Chrome will quietly proxy your requests through Google servers. Not sure if G forwards the proper X-Forwarded-For headers (or if all IP detection services read them).
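For reference, the usual heuristic a detection service applies when a proxy is in the path is to prefer the leftmost entry of X-Forwarded-For and fall back to the socket peer. A small Python sketch (the addresses below are illustrative documentation examples, not Google's real proxy IPs):

```python
def client_ip(headers, remote_addr):
    # Prefer the first (leftmost) X-Forwarded-For entry, which is the
    # original client in the usual proxy-appends convention; fall back
    # to the TCP peer address. Note the header is trivially spoofable
    # unless you only honor it from known, trusted proxies.
    xff = headers.get("X-Forwarded-For", "")
    if xff:
        return xff.split(",")[0].strip()
    return remote_addr

# What a request routed through a proxy might look like:
hdrs = {"X-Forwarded-For": "198.51.100.9, 74.125.0.1"}
ip = client_ip(hdrs, "74.125.0.1")  # the proxy is the TCP peer
```

Whether any given proxy actually sets the header honestly is exactly the open question in the comment above.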
The author says "I took" even though this was pure luck and coincidence. Attribution bias is strong in this one.
However, it is important to acknowledge that he did put himself into a position where he was available to become lucky (= he built the API and linked to it).
It's crazy to have details like lat/long - what if there's a proxy, and where does that data even come from? The ISP? Or do you take the time to build it out yourself, i.e. research? At any rate, cool.
Although I must admit, I'm a bit surprised as to why anyone would pay for this, as several HN readers have listed at least 5 other free (and some self-hosted) alternatives here...
Several HN readers have also pointed out that some of the free solutions have shut down because of lack of funding.
When money is on the table, users may expect that the provider is "more" answerable than a free service.
Just some rough calculations. Assuming the worst-case scenario, everyone in the highest tiers (the cheapest per request), 250M daily requests means he makes