I use a websocket to connect to a third-party data feed provider's server. My code for the websocket connection is:
this.websocket = new WebSocket("wss://socket.polygon.io/stocks", sslProtocols: SslProtocols.Tls12 | SslProtocols.Tls11 | SslProtocols.Tls);
Once the connection is established, we receive roughly 70,000 to 100,000 records every minute. We split each response apart and store the pieces in separate files, one per symbol: data received for AAPL goes into the AAPL file, and likewise for FB, MSFT, IBM, QQQ, and so on. In total we have to maintain about 10,000 files and write the live records into them as they arrive.
public static string tempFile = @"D:\TempFileForLiveMarket\tempFileStoreLiveSymbols.txt";
public static System.IO.StreamWriter w;

private void websocket_MessageReceived(object sender, MessageReceivedEventArgs e)
{
    using (w = System.IO.File.AppendText(tempFile))
    {
        Log(e.Message, w);
    }
    using (System.IO.StreamReader r = System.IO.File.OpenText(tempFile))
    {
        DumpLog(r);
    }
}
public static void Log(string responseMessage, System.IO.TextWriter w)
{
    w.WriteLine(responseMessage);
}

public static void DumpLog(System.IO.StreamReader r)
{
    string line;
    while ((line = r.ReadLine()) != null)
    {
        WriteRecord(line);
    }
}
public static void WriteRecord(string data)
{
    List<LiveData> ld = JsonConvert.DeserializeObject<List<LiveData>>(data);
    var filterData = ld.Where(x => symbolList.Contains(x.sym));
    List<string> fileLines = new List<string>();
    foreach (var item in filterData)
    {
        var fileName = @"D:\SymbolsData\" + item.sym + "_day_Aggregate.txt";
        fileLines = File.ReadAllLines(fileName).AsParallel().Skip(1).ToList();
        if (fileLines.Count > 1)
        {
            var lastLine = fileLines.Last();
            if (!lastLine.Contains(item.sym))
            {
                fileLines.RemoveAt(fileLines.Count - 1);
            }
        }
        fileLines.Add(item.sym + "," + item.s + "," + item.p + "-----");
        System.IO.File.WriteAllLines(fileName, fileLines);
    }
}
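For context, LiveData and symbolList are not shown in the question. Below is a minimal sketch of what the deserialization target could look like, inferred only from the fields the code above reads (sym, s, p); the property types and the symbolList initializer are assumptions, and the real feed payload carries more fields.

// Sketch (not from the question): the model JsonConvert binds to above, inferred
// purely from usage. Property names must match the JSON keys; types are assumptions.
public class LiveData
{
    public string sym { get; set; }   // ticker symbol, e.g. "AAPL"
    public string s { get; set; }     // written into the per-symbol line; exact meaning depends on the feed
    public string p { get; set; }     // written into the per-symbol line; exact meaning depends on the feed
}

// symbolList is also assumed here: the set of tickers whose files exist under D:\SymbolsData\.
public static HashSet<string> symbolList = new HashSet<string> { "AAPL", "FB", "MSFT", "IBM", "QQQ" };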
So when the websocket connection is established and we perform these operations on the live market data across our 10,000 files, everything slows down, and after a few seconds the websocket connection closes with a message like:
Websocket Error
Received an unexpected EOF or 0 bytes from the transport stream.
Connection Closed...
I'm doing this whole process because in the next phase I need to perform technical analysis on each symbol's live price. So how should I handle this situation? How can I make the processing fast enough to keep up with the incoming data? And how do I stop the connection from closing?
EDIT
I replaced the stream writer and the temp file with a StringBuilder, as shown below:
public static StringBuilder sb = new StringBuilder();
public static System.IO.StringWriter sw;

private void websocket_MessageReceived(object sender, MessageReceivedEventArgs e)
{
    sw = new System.IO.StringWriter(sb);
    sw.WriteLine(e.Message);
    Reader();
}

public static void Reader()
{
    System.IO.StringReader _sr = new System.IO.StringReader(sb.ToString());
    while (_sr.Peek() > -1)
    {
        WriteRecord(sb.ToString());
    }
    sb.Remove(0, sb.Length);
}
public static void WriteRecord(string data)
{
    List<LiveData> ld = JsonConvert.DeserializeObject<List<LiveData>>(data);
    var filterData = ld.Where(x => symbolList.Contains(x.sym));
    List<string> fileLines = new List<string>();
    foreach (var item in filterData)
    {
        var fileName = @"D:\SymbolsData\" + item.sym + "_day_Aggregate.txt";
        fileLines = File.ReadAllLines(fileName).AsParallel().Skip(1).ToList();
        fileLines.RemoveAt(fileLines.Count - 1);
        fileLines.Add(item.sym + "," + item.s + "," + item.p);
        System.IO.File.WriteAllLines(fileName, fileLines);
    }
}
It looks like you're appending each message to tempFile, but then you process the entire tempFile. That means you're constantly re-processing all the old data plus the new records, so yes: it will gradually take longer and longer, until eventually it takes so long that the other end gets tired of waiting and cuts you off. My suggestion: don't do that.
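To make that concrete, here is one minimal sketch (my illustration, not part of the original answer) of handling each message exactly once as it arrives: the websocket callback only queues the raw JSON, and a single background consumer calls the question's existing WriteRecord. The BlockingCollection approach and the StartConsumer name are assumptions.

using System.Collections.Concurrent;
using System.Threading.Tasks;

public static BlockingCollection<string> messages = new BlockingCollection<string>();

private void websocket_MessageReceived(object sender, MessageReceivedEventArgs e)
{
    // Hand the raw JSON off immediately; never block the websocket callback on file I/O.
    messages.Add(e.Message);
}

// Called once at startup: drains the queue on a background thread.
public static void StartConsumer()
{
    Task.Run(() =>
    {
        foreach (var message in messages.GetConsumingEnumerable())
        {
            WriteRecord(message);   // each message is processed exactly once
        }
    });
}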
There is also a lot you could do more efficiently in how you actually process each record, but that is insignificant compared with the overhead of constantly re-processing everything.
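As one example of processing a record more cheaply (again my own assumption, not something the answer spells out): instead of reading and rewriting the whole per-symbol file for every item, the new lines can be grouped by symbol and appended in one call per file. Note this is a pure append, so it drops the original code's replace-the-last-line behaviour, and it assumes the same usings and types as the question's snippet.

// Sketch: group the filtered items by symbol and append each group in a single call,
// so each file is touched once per batch and never fully re-read or re-written.
public static void WriteRecordAppendOnly(string data)
{
    List<LiveData> ld = JsonConvert.DeserializeObject<List<LiveData>>(data);
    var bySymbol = ld.Where(x => symbolList.Contains(x.sym))
                     .GroupBy(x => x.sym);

    foreach (var group in bySymbol)
    {
        var fileName = @"D:\SymbolsData\" + group.Key + "_day_Aggregate.txt";
        var newLines = group.Select(item => item.sym + "," + item.s + "," + item.p);
        System.IO.File.AppendAllLines(fileName, newLines);   // appends without reading the file back
    }
}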