C++ find size of ofstream

I have code that currently does something like the following:

#include <fstream>
using namespace std;

int main() {
    ofstream fout;
    fout.open("file.txt");
    fout << "blah blah " << 100 << "," << 3.14;
    // get ofstream length here
    fout << "write more stuff" << endl;
}

Is there a convenient way to find out the length of the line written up to the stage I marked above? (In real life, the int 100 and the float 3.14 are not constants and can change.) Is there a good way to do what I want?

EDIT: by length, I mean something that can be used with fseek, e.g.

fseek(pFile, -linelength, SEEK_END);
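
For reference, here is a minimal sketch of one possible approach, using std::ostream::tellp() to record the put position before and after the write; the variable names before, after, and linelength are illustrative, not taken from the answer below:

#include <fstream>
#include <iostream>

int main() {
    std::ofstream fout("file.txt");

    std::streampos before = fout.tellp();   // put position before the line
    fout << "blah blah " << 100 << "," << 3.14;
    std::streampos after = fout.tellp();    // put position after the line

    // Byte count of what was just written; usable later as a seek offset,
    // e.g. fseek(pFile, -linelength, SEEK_END) on the same file.
    std::streamoff linelength = after - before;
    std::cout << "line length: " << linelength << " bytes\n";

    fout << "write more stuff" << std::endl;
    return 0;
}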

Source: https://stackoverflow.com/questions/16825055

Best answer

Response from GCS:

Our product team was able to find the reason the indexes cannot be created. It appears that, within the data, you have two entries that are not unique; this creates a uniqueness violation[1] and prevents the index from being created. The error is thrown before the index build is attempted, which is why the creation fails up front.

You can use a query to find the duplicate entries:

SELECT column, COUNT(column)
FROM table
GROUP BY column
HAVING COUNT(column) > 1

You can modify this query to search all keys at the same time, or adjust it after each search. Once the duplicate entries have been cleaned up, you should be able to run the index creation again.
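
As a hedged sketch of that modification, where the column names key_a and key_b are hypothetical placeholders for the index's key parts, a single query can group on all of them at once:

-- key_a and key_b are hypothetical stand-ins for the indexed key columns.
SELECT key_a, key_b, COUNT(*)
FROM table
GROUP BY key_a, key_b
HAVING COUNT(*) > 1

Any row returned identifies a combination of key values that appears more than once and would violate the unique index.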


I hope the Spanner team can fix this bug and return the correct error in a future release.
