Work with bytes in Python
I have to build a byte array and send it over a socket.
The structure looks like: 1 byte + 2 bytes + 2 bytes.
The first byte is the number 5, the next 2 bytes should be taken from the variable first, and the last 2 bytes should be taken from the variable second. What's the right way to do this in Python?

    id = 5      # Fill as 1 byte
    first = 42  # Fill as 2 bytes
    second = 58 # The same as first
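One common way to pack a fixed layout like this is the standard-library struct module. A minimal sketch (big-endian byte order is assumed here, since the question does not specify one; adjust the format string to the actual protocol):

```python
import struct

id_byte = 5   # fills 1 byte
first = 42    # fills 2 bytes
second = 58   # fills 2 bytes

# ">BHH" = big-endian: one unsigned byte, then two unsigned 16-bit integers.
packed = struct.pack(">BHH", id_byte, first, second)
print(packed)       # b'\x05\x00*\x00:'
print(len(packed))  # 5
```

The resulting bytes object can be passed directly to socket.send(). Use "<BHH" instead if the protocol is little-endian.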
Source: https://stackoverflow.com/questions/12073115
Accepted answer
I would add a column called scan_number so that you can compare the latest scan with the previous scan.

    SELECT curr.file, prev.file, curr.DateModified, prev.DateModified
    FROM table1 curr
    LEFT JOIN table1 prev
      ON curr.file = prev.file
      AND curr.scan_number = 100
      AND prev.scan_number = 99
    WHERE curr.DateModified != prev.DateModified
       OR curr.file IS NULL
       OR prev.file IS NULL

If you want to catch both inserts and deletes, you need a FULL OUTER JOIN, but it seems SQLite doesn't support that. You might have to run the query twice: once to find inserts and updates, and once to find deletes.
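To make the two-pass idea concrete, here is a small runnable sketch using Python's built-in sqlite3 module. The table and column names follow the answer; the sample rows and the scan numbers 99/100 are illustrative assumptions, not from the original question:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE table1 (file TEXT, DateModified TEXT, scan_number INTEGER);
-- scan 99 (previous): a.txt, b.txt
INSERT INTO table1 VALUES ('a.txt', '2012-01-01', 99);
INSERT INTO table1 VALUES ('b.txt', '2012-01-01', 99);
-- scan 100 (latest): a.txt modified, b.txt deleted, c.txt added
INSERT INTO table1 VALUES ('a.txt', '2012-02-01', 100);
INSERT INTO table1 VALUES ('c.txt', '2012-02-01', 100);
""")

# Pass 1: inserts and updates (latest scan on the left of the LEFT JOIN).
changed_or_new = conn.execute("""
    SELECT curr.file
    FROM table1 curr
    LEFT JOIN table1 prev
      ON curr.file = prev.file AND prev.scan_number = 99
    WHERE curr.scan_number = 100
      AND (prev.file IS NULL OR curr.DateModified != prev.DateModified)
    ORDER BY curr.file
""").fetchall()

# Pass 2: deletes (previous scan on the left, no match in the latest scan).
deleted = conn.execute("""
    SELECT prev.file
    FROM table1 prev
    LEFT JOIN table1 curr
      ON prev.file = curr.file AND curr.scan_number = 100
    WHERE prev.scan_number = 99
      AND curr.file IS NULL
    ORDER BY prev.file
""").fetchall()

print(changed_or_new)  # [('a.txt',), ('c.txt',)]
print(deleted)         # [('b.txt',)]
```

Together the two passes give the same rows a FULL OUTER JOIN would: modified and new files from the first query, deleted files from the second.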