Declaring an array of std::bitset using std::string
I am currently trying to declare an array of 17 std::bitsets, each 32 bits long. Here is how I did it:
std::bitset<32> mTestInstruction[17]
{
std::string("01000000001000000000000000000001"),
std::string("01000000011000000000000001100011"),
std::string("01000000101000000000000000000001"),
std::string("10100000000000000000000000001010"),
std::string("00000000100000010000000010000010"),
std::string("00000000110001010010000000000001"),
std::string("01001000111001010000000000000000"),
std::string("01000100001000110000000000000011"),
std::string("01000000001000010000000000000001"),
std::string("10000000000000000000000000000011"),
std::string("00000000010000000000000000000001"),
std::string("00000000111000000000000000000001"),
std::string("00000000111001110000100000000001"),
std::string("01000000010000100000000000000001"),
std::string("01000100001000100000000000000010"),
std::string("10000000000000000000000000001100"),
std::string("11100000000000000000000000001000"),
};
I get the following error:
error: could not convert 'std::__cxx11::basic_string<char>(((const char*)"01000000001000000000000000000001"), std::allocator<char>())' from 'std::__cxx11::string {aka std::__cxx11::basic_string<char>}' to 'std::bitset<32u>'
for every bit string.
I don't understand why, since according to cppreference a std::string is a valid way to construct a bitset.
Can anyone point out how to fix this?
You need to invoke the std::bitset constructor explicitly, for example:
std::bitset< 32 > mTestInstruction[17]
{
std::bitset< 32 >( std::string( "01000000001000000000000000000001" ) ),
std::bitset< 32 >( std::string( "01000000011000000000000001100011" ) ),
// ...
};
Or, even shorter:
std::bitset< 32 > mTestInstruction[17]
{
std::bitset< 32 >( "01000000001000000000000000000001" ),
std::bitset< 32 >( "01000000011000000000000001100011" ),
// ...
};
The reason your code does not work is that the std::bitset constructor taking a std::string is marked explicit (see here).
Starting with C++14, you can also use binary literals, for example:
std::bitset< 32 > mTestInstruction[17]
{
0b01000000001000000000000000000001,
0b01000000011000000000000001100011,
// ...
};
You can do it like this:
std::bitset<32> mTestInstruction[17]{
std::bitset<32>{std::string("01000000001000000000000000000001")},
std::bitset<32>{std::string("01000000011000000000000001100011")},
std::bitset<32>{std::string("01000000101000000000000000000001")},
std::bitset<32>{std::string("10100000000000000000000000001010")},
std::bitset<32>{std::string("00000000100000010000000010000010")},
std::bitset<32>{std::string("00000000110001010010000000000001")},
std::bitset<32>{std::string("01001000111001010000000000000000")},
std::bitset<32>{std::string("01000100001000110000000000000011")},
std::bitset<32>{std::string("01000000001000010000000000000001")},
std::bitset<32>{std::string("10000000000000000000000000000011")},
std::bitset<32>{std::string("00000000010000000000000000000001")},
std::bitset<32>{std::string("00000000111000000000000000000001")},
std::bitset<32>{std::string("00000000111001110000100000000001")},
std::bitset<32>{std::string("01000000010000100000000000000001")},
std::bitset<32>{std::string("01000100001000100000000000000010")},
std::bitset<32>{std::string("10000000000000000000000000001100")},
std::bitset<32>{std::string("11100000000000000000000000001000")},
};