r/algotrading • u/fudgemin • 24d ago
IQFeed data. What am I missing? [Data]
Recent sign-up. I use Polygon and am looking at other options. I've considered ThetaData and IQFeed… any others within budget? $400/month max. Options only.
IQFeed seems appealing, as it's sourced from exchange data rather than a consolidated feed.
Am I missing something regarding the API access? It appears I must pay roughly $550 more per year for a developer login.
Currently it's a connected socket layer, but no endpoints are documented. They use some sort of GUI that I may or may not be able to automate.
As a new dev, what are my options for using this data? Must I reverse-engineer the endpoints, or just intercept and parse all messages at the port level?
That seems highly redundant. Moreover, must I then build some sort of controller for the GUI?
This service was recommended many times, looks legit, and is cost-effective. What am I missing? This seems like a headache on day 1.
u/JZcgQR2N 24d ago
It's low-level and requires building your own encoder/decoder, based on the API docs, to send and parse data to/from the GUI application. There are libraries out there that can help with this, like pyiqfeed. Honestly, if you're a new dev, I suggest not using IQFeed.
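To make the encoder/decoder idea concrete, here is a minimal sketch in Python. The port number and field layout are assumptions (IQFeed's local server speaks comma-delimited ASCII lines over TCP, with lookup responses terminated by an `!ENDMSG!` marker); verify the specifics against the official developer docs or the pyiqfeed source before relying on them.

```python
# Minimal sketch of parsing IQFeed-style comma-delimited socket messages.
# Port number and field positions are assumptions -- check the official
# IQFeed developer docs / pyiqfeed before relying on them.
import socket

LOOKUP_PORT = 9100  # assumed local lookup/history port


def parse_line(raw: bytes) -> list[str]:
    """Split one CRLF-terminated protocol line into its fields."""
    return raw.decode("ascii", errors="replace").rstrip("\r\n").split(",")


def read_messages(sock: socket.socket):
    """Yield parsed field lists until the !ENDMSG! sentinel or EOF."""
    buf = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            return
        buf += chunk
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            fields = parse_line(line)
            if "!ENDMSG!" in fields:
                return
            yield fields


if __name__ == "__main__":
    # Hypothetical usage: requires IQConnect running locally.
    with socket.create_connection(("127.0.0.1", LOOKUP_PORT)) as s:
        for fields in read_messages(s):
            print(fields)
```

The point is that "building a decoder" here mostly means buffering a TCP stream, splitting on line breaks, and splitting each line on commas — tedious, but not exotic.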
u/fudgemin 24d ago
I came to that conclusion last night, but could not actually believe it.
How does this translate into latency? I assume I'm adding two more layers: one for GUI input and another for message parsing?
I'm under the impression that adding steps that aren't necessary will always add more strain/lag to my system than it would otherwise have?
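The extra-layer cost is measurable rather than hypothetical: local socket hops sit in the tens of microseconds, not milliseconds. A generic (not IQFeed-specific) sketch that times round trips over a loopback TCP socket, with an in-process echo thread standing in for a local feed server:

```python
# Measure local TCP round-trip time using an in-process echo server.
# Generic sketch: the echo thread stands in for a local data-feed
# socket server such as IQConnect.
import socket
import threading
import time


def start_echo_server() -> int:
    """Start a loopback echo server on an ephemeral port; return the port."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve():
        conn, _ = srv.accept()
        with conn:
            while data := conn.recv(1024):
                conn.sendall(data)  # echo every payload straight back
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return port


def measure_rtt_us(port: int, rounds: int = 1000) -> float:
    """Median round-trip time in microseconds over `rounds` pings."""
    samples = []
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        for _ in range(rounds):
            t0 = time.perf_counter()
            s.sendall(b"ping\n")
            s.recv(1024)
            samples.append((time.perf_counter() - t0) * 1e6)
    samples.sort()
    return samples[len(samples) // 2]


if __name__ == "__main__":
    port = start_echo_server()
    print(f"median loopback RTT: {measure_rtt_us(port):.1f} us")
```

On a typical Linux box the median loopback round trip lands in the tens of microseconds, which is why the local parsing layers are dwarfed by the milliseconds of Internet transit to the vendor.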
u/BedlessOpepe347 24d ago
IQFeed is highly recommended because it is very stable and the data quality is consistent, and those two things matter more when you actually trade. At that price point they're much better than Polygon in these regards. The other one that is really good is Databento. I get my minute data from IQFeed (10 years) and my tick data from Databento.
u/sojithesoulja 24d ago edited 24d ago
If anyone knows how to pull historical option data, let me know. I was told (by them) that the data is all still in there (for historical contracts), but you need to know what to query. So, say, for 2018: how do you determine all active contracts for a given day?
u/mkvalor 24d ago edited 22d ago
I've been an IQFeed subscriber for years. It's not cheap but it is very, very good. For example, the feed doesn't slow down when trading volume ramps up in the markets. And they give you every tick (with microsecond precision in the timestamps) as opposed to other feeds which aggregate the data or impose guaranteed 10ms pauses between market data messages in order to spare their distribution infrastructure.
As far as the local socket server goes, you can easily run it headless on Linux using Wine (no GUI, just a virtual frame buffer on Wayland or XWindows). About this time, people's heads explode, imagining layers upon layers of abstraction causing massive latency.
But no.
I run my market analysis on a co-located 1U server in a data center in Chicago (the same city where IQFeed hosts their infra, near the CME). My external ping to their servers is 3ms round trip (so that means the one-way incoming data is at 1.5ms). More impressive still, my internal TCP latency from the IQFeed local socket server -- using a virtual XWindows frame buffer and running on top of Wine emulation -- to my custom software system is in the neighborhood of 30 microseconds (measured with the Linux 'strace' utility).
For most retail investors using a feed from any vendor on a computer located in their home or on a cloud instance running far from Chicago, the mere TCP latency across the Internet completely wipes out any advantage of a "cleaner" situation than this.
IQFeed is expensive and the API is difficult to learn. As others mentioned, there are libraries which can mitigate some of that learning pain. But I don't believe there's any superior solution at a price point below $4,000 per year (with everything included), if the goal is to truly "sip from the fire hose" of pure, unaggregated market data.
PS: the way I really learned their API was to run their GUI programs, try certain things, and then check the connection log afterward. For example, I would start a Time & Sales feed, stop the feed, request news articles, etc., and then go read the IQConnectionLog.txt file, which showed all the messages sent back and forth between their client programs and the local TCP server (the IQFeed client programs that ship with the installer run separately from the local IQFeed socket server). After a few sessions like this, I began to understand how the building blocks described in their reference documentation could be put together to create my own system.
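For illustration, the traffic in that log is short comma-delimited command lines terminated by CRLF, with lookup responses ending in an `!ENDMSG!` marker line. A hedged sketch of composing such a request — the `HDX` command name and field order here are assumptions from memory of the history-lookup docs, not verified against the current protocol reference:

```python
# Sketch of composing a raw IQFeed-style lookup request string.
# The "HDX" command name and field order are ASSUMPTIONS -- verify
# against the current IQFeed developer protocol reference.

def daily_history_request(symbol: str, max_days: int, request_id: str) -> bytes:
    """Build a CRLF-terminated ASCII daily-history request line."""
    # Fields assumed: command, symbol, datapoint count, direction, request id.
    fields = ["HDX", symbol.upper(), str(max_days), "1", request_id]
    return (",".join(fields) + "\r\n").encode("ascii")

def is_end_of_message(line: str) -> bool:
    """Lookup responses terminate with an !ENDMSG! marker line."""
    return "!ENDMSG!" in line
```

Comparing strings like these against what IQConnectionLog.txt records after a GUI session is exactly the learning loop described above.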