Google C++ Testing Framework AdvancedGuide

    AdvancedGuide  
    Updated  Jul 23, 2013 by  w...@google.com

    This document describes advanced features of the Google C++ Testing Framework. The topics covered are:

    • Explicit Success and Failure
    • Exception Assertions
    • Predicate Assertions for Better Error Messages
      • Using an Existing Boolean Function
      • Using a Function That Returns an AssertionResult
      • Using a Predicate-Formatter
    • Floating-Point Comparison
      • Floating-Point Macros
      • Floating-Point Predicate-Format Functions
    • Windows HRESULT assertions
    • Type Assertions
    • Assertion Placement
    • Teaching Google Test How to Print Your Values
    • Death Tests
      • How to Write a Death Test
      • Regular Expression Syntax
      • How It Works
      • Death Tests And Threads
      • Death Test Styles
      • Caveats
    • Using Assertions in Sub-routines
      • Adding Traces to Assertions
      • Propagating Fatal Failures
        • Asserting on Subroutines
        • Checking for Failures in the Current Test
    • Logging Additional Information
    • Sharing Resources Between Tests in the Same Test Case
    • Global Set-Up and Tear-Down
    • Value Parameterized Tests
      • How to Write Value-Parameterized Tests
      • Creating Value-Parameterized Abstract Tests
    • Typed Tests
    • Type-Parameterized Tests
    • Testing Private Code
      • Static Functions
      • Private Class Members
    • Catching Failures
    • Getting the Current Test's Name
    • Extending Google Test by Handling Test Events
      • Defining Event Listeners
      • Using Event Listeners
      • Generating Failures in Listeners
    • Running Test Programs: Advanced Options
      • Selecting Tests
        • Listing Test Names
        • Running a Subset of the Tests
        • Temporarily Disabling Tests
        • Temporarily Enabling Disabled Tests
      • Repeating the Tests
      • Shuffling the Tests
      • Controlling Test Output
        • Colored Terminal Output
        • Suppressing the Elapsed Time
        • Generating an XML Report
      • Controlling How Failures Are Reported
        • Turning Assertion Failures into Break-Points
        • Disabling Catching Test-Thrown Exceptions
        • Letting Another Testing Framework Drive
      • Distributing Test Functions to Multiple Machines
    • Fusing Google Test Source Files
    • Where to Go from Here
    Now that you have read the Primer and learned how to write tests using Google Test, it's time to learn its more advanced features.

    More Assertions

    This section covers some less frequently used, but still significant, assertions.

    Explicit Success and Failure

    These three assertions do not actually test a value or expression. Instead, they generate a success or failure directly. Like the macros that actually perform a test, you may stream a custom failure message into them.

    SUCCEED();

    Generates a success. This does NOT make the overall test succeed. A test is considered successful only if none of its assertions fail during its execution.

    FAIL();
    ADD_FAILURE();
    ADD_FAILURE_AT("file_path", line_number);

    FAIL() generates a fatal failure, while ADD_FAILURE() and ADD_FAILURE_AT() generate a nonfatal failure. These are useful when control flow, rather than a Boolean expression, determines the test's success. For example, you might want to write something like:

    switch(expression) {
      case 1:
        ... some checks ...
      case 2:
        ... some other checks ...
      default:
        FAIL() << "We shouldn't get here.";
    }

    Note: you can only use FAIL() in functions that return void. See the Assertion Placement section for more information.

    Availability: Linux, Windows, Mac.

    Exception Assertions

    These are for verifying that a piece of code throws (or does not throw) an exception of the given type:

    Fatal assertion                            Nonfatal assertion                         Verifies
    ASSERT_THROW(statement, exception_type);   EXPECT_THROW(statement, exception_type);   statement throws an exception of the given type
    ASSERT_ANY_THROW(statement);               EXPECT_ANY_THROW(statement);               statement throws an exception of any type
    ASSERT_NO_THROW(statement);                EXPECT_NO_THROW(statement);                statement doesn't throw any exception

    Examples:

    ASSERT_THROW(Foo(5), bar_exception);

    EXPECT_NO_THROW({
      int n = 5;
      Bar(&n);
    });

    Availability: Linux, Windows, Mac; requires exceptions to be enabled in the build environment.

    Predicate Assertions for Better Error Messages

    Even though Google Test has a rich set of assertions, they can never be complete, as it's impossible (nor a good idea) to anticipate all scenarios a user might run into. Therefore, sometimes a user has to use EXPECT_TRUE() to check a complex expression, for lack of a better macro. This has the problem of not showing you the values of the parts of the expression, making it hard to understand what went wrong. As a workaround, some users choose to construct the failure message by themselves and stream it into EXPECT_TRUE(). However, this is awkward, especially when the expression has side-effects or is expensive to evaluate.

    Google Test gives you three different options to solve this problem:

    Using an Existing Boolean Function

    If you already have a function or functor that returns bool (or a type that can be implicitly converted to bool), you can use it in a predicate assertion to get the function arguments printed for free:

    Fatal assertion                   Nonfatal assertion                Verifies
    ASSERT_PRED1(pred1, val1);        EXPECT_PRED1(pred1, val1);        pred1(val1) returns true
    ASSERT_PRED2(pred2, val1, val2);  EXPECT_PRED2(pred2, val1, val2);  pred2(val1, val2) returns true
    ...                               ...                               ...

    In the above, predn is an n-ary predicate function or functor, and val1, val2, ..., and valn are its arguments. The assertion succeeds if the predicate returns true when applied to the given arguments, and fails otherwise. When the assertion fails, it prints the value of each argument. Currently we only provide predicate assertions of arity <= 5.

    As an example, suppose you have the following code:

    // Returns true iff m and n have no common divisors except 1.
    bool MutuallyPrime(int m, int n) { ... }

    const int a = 3;
    const int b = 4;
    const int c = 10;

    the assertion EXPECT_PRED2(MutuallyPrime, a, b); will succeed, while the assertion EXPECT_PRED2(MutuallyPrime, b, c); will fail with the message

    !MutuallyPrime(b, c) is false, where
    b is 4
    c is 10

    Note: make sure the predicate has a single interpretation when used in the assertion; if the function is overloaded or is a function template, you may need to explicitly specify the version you mean (for example via a cast or explicit template arguments).

    Availability: Linux, Windows, Mac
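    The example above only declares MutuallyPrime(). A minimal sketch of how such a predicate might be implemented (the name matches the declaration above; the Euclidean-algorithm body is our own assumption, not part of the guide):

    ```cpp
    // Returns the greatest common divisor of two non-negative integers
    // using the Euclidean algorithm.
    int Gcd(int m, int n) {
      while (n != 0) {
        int r = m % n;
        m = n;
        n = r;
      }
      return m;
    }

    // Returns true iff m and n have no common divisors except 1,
    // matching the contract of the MutuallyPrime() declaration above.
    bool MutuallyPrime(int m, int n) {
      return Gcd(m, n) == 1;
    }
    ```

    With this implementation, EXPECT_PRED2(MutuallyPrime, 3, 4) would succeed and EXPECT_PRED2(MutuallyPrime, 4, 10) would fail, as described above.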

    Using a Function That Returns an AssertionResult

    While EXPECT_PRED*() and friends are handy for a quick job, the syntax is not satisfactory: you have to use different macros for different arities, and it feels more like Lisp than C++. The ::testing::AssertionResult class solves this problem.

    An AssertionResult object represents the result of an assertion (whether it's a success or a failure, and an associated message). You can create an AssertionResult using one of these factory functions:

    namespace testing {

    // Returns an AssertionResult object to indicate that an assertion has
    // succeeded.
    AssertionResult AssertionSuccess();

    // Returns an AssertionResult object to indicate that an assertion has
    // failed.
    AssertionResult AssertionFailure();

    }

    You can then use the << operator to stream messages to the AssertionResult object. To provide more readable messages in Boolean assertions (e.g. EXPECT_TRUE()), write a predicate function that returns AssertionResult instead of bool. For example, if you define IsEven() as:

    ::testing::AssertionResult IsEven(int n) {
      if ((n % 2) == 0)
        return ::testing::AssertionSuccess();
      else
        return ::testing::AssertionFailure() << n << " is odd";
    }

    instead of:

    bool IsEven(int n) {
      return (n % 2) == 0;
    }

    the failed assertion EXPECT_TRUE(IsEven(Fib(4))) will print:

    Value of: IsEven(Fib(4))
      Actual: false (3 is odd)
    Expected: true

    instead of a more opaque

    Value of: IsEven(Fib(4))
      Actual: false
    Expected: true

    If you want informative messages in EXPECT_FALSE and ASSERT_FALSE as well, and are fine with making the predicate slower in the success case, you can supply a success message too:

    ::testing::AssertionResult IsEven(int n) {
      if ((n % 2) == 0)
        return ::testing::AssertionSuccess() << n << " is even";
      else
        return ::testing::AssertionFailure() << n << " is odd";
    }

    Then the failed assertion EXPECT_FALSE(IsEven(Fib(6))) will print:

    Value of: IsEven(Fib(6))
      Actual: true (8 is even)
    Expected: false

    Availability: Linux, Windows, Mac; since version 1.4.1.

    Using a Predicate-Formatter

    If you find the default message generated by EXPECT_PRED* and EXPECT_TRUE unsatisfactory, or some arguments to your predicate do not support streaming to ostream, you can instead use the following predicate-formatter assertions to fully customize how the message is formatted:

    Fatal assertion                                 Nonfatal assertion                              Verifies
    ASSERT_PRED_FORMAT1(pred_format1, val1);        EXPECT_PRED_FORMAT1(pred_format1, val1);        pred_format1(val1) is successful
    ASSERT_PRED_FORMAT2(pred_format2, val1, val2);  EXPECT_PRED_FORMAT2(pred_format2, val1, val2);  pred_format2(val1, val2) is successful
    ...                                             ...                                             ...

    The difference between this and the previous group of macros is that instead of a predicate, ASSERT_PRED_FORMAT* takes a predicate-formatter (pred_formatn): a function or functor that receives both the values of the predicate arguments and the corresponding source-code expressions, and returns an AssertionResult.

    As an example, let's improve the failure message of MutuallyPrime(), which was used with EXPECT_PRED2():

    // Returns the smallest prime common divisor of m and n,
    // or 1 when m and n are mutually prime.
    int SmallestPrimeCommonDivisor(int m, int n) { ... }

    // A predicate-formatter for asserting that two integers are mutually prime.
    ::testing::AssertionResult AssertMutuallyPrime(const char* m_expr,
                                                   const char* n_expr,
                                                   int m,
                                                   int n) {
      if (MutuallyPrime(m, n))
        return ::testing::AssertionSuccess();

      return ::testing::AssertionFailure()
          << m_expr << " and " << n_expr << " (" << m << " and " << n
          << ") are not mutually prime, as they have a common divisor "
          << SmallestPrimeCommonDivisor(m, n);
    }

    With this predicate-formatter, we can use

    EXPECT_PRED_FORMAT2(AssertMutuallyPrime, b, c);
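    The SmallestPrimeCommonDivisor() helper is only declared above. A straightforward trial-division sketch of it (the implementation is our own illustration, not from the guide):

    ```cpp
    // Returns the smallest prime common divisor of m and n,
    // or 1 when m and n are mutually prime.  The smallest common
    // divisor greater than 1 is necessarily prime: if it were
    // composite, one of its prime factors would be a smaller
    // common divisor.
    int SmallestPrimeCommonDivisor(int m, int n) {
      for (int p = 2; p <= m && p <= n; ++p) {
        if (m % p == 0 && n % p == 0)
          return p;
      }
      return 1;
    }
    ```

    For the example values above, SmallestPrimeCommonDivisor(4, 10) returns 2, which is exactly the divisor named in the failure message.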

    to generate the message:

    b and c (4 and 10) are not mutually prime, as they have a common divisor 2.

    As you may have realized, many of the built-in assertions we introduced earlier are special cases of (ASSERT|EXPECT)_PRED_FORMAT*. In fact, most of them are indeed defined using (ASSERT|EXPECT)_PRED_FORMAT*.

    Availability: Linux, Windows, Mac.

    Floating-Point Comparison

    Comparing floating-point numbers is tricky. Due to round-off errors, it is very unlikely that two floating-point values will match exactly, so ASSERT_EQ's naive comparison often doesn't work. And since floating-point values can have a wide value range, no single fixed error bound works either: it's better to compare by a fixed relative error bound, except for values close to 0, due to the loss of precision there.

    In general, for floating-point comparison to make sense, the user needs to carefully choose the error bound. If they don't want or care to, comparing in terms of Units in the Last Place (ULPs) is a good default, and Google Test provides assertions to do this. Full details about ULPs are quite long; if you want to learn more, see this article on float comparison.

    Floating-Point Macros

    Fatal assertionNonfatal assertionVerifies
    ASSERT_FLOAT_EQ(expected, actual);EXPECT_FLOAT_EQ(expected, actual);the two float values are almost equal
    ASSERT_DOUBLE_EQ(expected, actual);EXPECT_DOUBLE_EQ(expected, actual);the two double values are almost equal

    By "almost equal", we mean the two values are within 4 ULP's from each other.

    The following assertions allow you to choose the acceptable error bound:

    Fatal assertionNonfatal assertionVerifies
    ASSERT_NEAR(val1, val2, abs_error);EXPECT_NEAR(val1, val2, abs_error);the difference between val1 and val2 doesn't exceed the given absolute error

    Availability: Linux, Windows, Mac.

    Floating-Point Predicate-Format Functions

    Some floating-point operations are useful, but not that often used. In order to avoid an explosion of new macros, we provide them as predicate-format functions that can be used with the predicate assertion macros (e.g. EXPECT_PRED_FORMAT2, etc):

    EXPECT_PRED_FORMAT2(::testing::FloatLE, val1, val2);
    EXPECT_PRED_FORMAT2(::testing::DoubleLE, val1, val2);

    These verify that val1 is less than, or almost equal to, val2. You can replace EXPECT with ASSERT in the above if you want the assertion to be fatal.

    Availability: Linux, Windows, Mac.

    Windows HRESULT assertions

    These assertions test for HRESULT success or failure.

    Fatal assertionNonfatal assertionVerifies
    ASSERT_HRESULT_SUCCEEDED(expression);EXPECT_HRESULT_SUCCEEDED(expression);expression is a success HRESULT
    ASSERT_HRESULT_FAILED(expression);EXPECT_HRESULT_FAILED(expression);expression is a failure HRESULT

    The generated output contains the human-readable error message associated with the HRESULT code returned by expression. You might use them like this:

    CComPtr<IShellDispatch2> shell;
    ASSERT_HRESULT_SUCCEEDED(shell.CoCreateInstance(L"Shell.Application"));
    CComVariant empty;
    ASSERT_HRESULT_SUCCEEDED(shell->ShellExecute(CComBSTR(url), empty, empty, empty, empty));

    Availability: Windows.

    Type Assertions

    You can call the function

    ::testing::StaticAssertTypeEq<T1, T2>();

    to assert that types T1 and T2 are the same. The function does nothing if the assertion is satisfied. If the types are different, the function call will fail to compile, and (depending on the compiler) the error message will show you the actual values of T1 and T2. This is mainly useful inside template code.

    Caveat: when used inside a member function of a class template or a function template, StaticAssertTypeEq<T1, T2>() is effective only if the function is instantiated. For instance, given:

    template <typename T> class Foo {
     public:
      void Bar() { ::testing::StaticAssertTypeEq<int, T>(); }
    };

    the code:

    void Test1() { Foo<bool> foo; }

    will not generate a compiler error, as Foo<bool>::Bar() is never instantiated. Instead, you need:

    void Test2() { Foo<bool> foo; foo.Bar(); }

    to cause a compiler error.

    Availability: Linux, Windows, Mac; since version 1.3.0.

    Assertion Placement

    You can use assertions in any C++ function. It doesn't have to be a method of the test fixture class. The one constraint is that assertions that generate a fatal failure (FAIL* and ASSERT_*) can only be used in void-returning functions. This is a consequence of Google Test not using exceptions. By placing one in a non-void function you'll get a confusing compile error like "error: void value not ignored as it ought to be".

    If you need to use fatal assertions in a function that returns non-void, one option is to make the function return the value via an out parameter instead. For example, you can rewrite T2 Foo(T1 x) to void Foo(T1 x, T2* result). You need to make sure that *result contains some sensible value even when the function returns prematurely. If changing the function's type is not an option, just use assertions that generate non-fatal failures, such as ADD_FAILURE*() and EXPECT_*().

    Note: constructors and destructors are not considered void-returning functions, according to the C++ language specification, so you may not use fatal assertions in them. A simple workaround is to transfer the body of the constructor or destructor to a private void-returning method. Be aware, though, that a fatal assertion failure in a constructor does not terminate the current test; it merely returns from the constructor early, possibly leaving your object in a partially-constructed state. Likewise, a fatal assertion failure in a destructor may leave your object in a partially-destructed state. Use assertions carefully in these situations!

    Teaching Google Test How to Print Your Values

    When a test assertion such as EXPECT_EQ fails, Google Test prints the argument values to help you debug. It does this using a user-extensible value printer. The printer knows how to print built-in C++ types, native arrays, STL containers, and any type that supports the << operator; for other types, it prints the raw bytes in the value and hopes that you the user can figure it out.

    To teach the printer to do a better job with your particular type than dumping bytes, define << for the type:

    #include <iostream>

    namespace foo {

    class Bar { ... };  // We want Google Test to be able to print instances of this.

    // It's important that the << operator is defined in the SAME
    // namespace that defines Bar.  C++'s look-up rules rely on that.
    ::std::ostream& operator<<(::std::ostream& os, const Bar& bar) {
      return os << bar.DebugString();  // whatever needed to print bar to os
    }

    }  // namespace foo

    Sometimes, this might not be an option: your team may consider it bad style to have a << operator for Bar, or Bar may already have a << operator that doesn't do what you want (and you cannot change it). If so, you can instead define a PrintTo() function like this:

    #include <iostream>

    namespace foo {

    class Bar { ... };

    // It's important that PrintTo() is defined in the SAME
    // namespace that defines Bar.  C++'s look-up rules rely on that.
    void PrintTo(const Bar& bar, ::std::ostream* os) {
      *os << bar.DebugString();  // whatever needed to print bar to os
    }

    }  // namespace foo

    If you have defined both << and PrintTo(), the latter is used by Google Test. This lets you customize how the value appears in Google Test's output without affecting code that relies on the behavior of the << operator.

    If you want to print a value x using Google Test's value printer yourself, just call ::testing::PrintToString(x), which returns a string:

    vector<pair<Bar, int> > bar_ints = GetBarIntVector();

    EXPECT_TRUE(IsCorrectBarIntVector(bar_ints))
        << "bar_ints = " << ::testing::PrintToString(bar_ints);

    Death Tests

    In many applications, there are assertions that can cause application failure if a condition is not met. These sanity checks, which ensure that the program is in a known good state, are there to fail at the earliest possible time after some program state is corrupted. If the assertion checks the wrong condition, then the program may proceed in an erroneous state, which could lead to memory corruption, security holes, or worse. Hence it is vitally important to test that such assertion statements work as expected.

    Since these precondition checks cause the processes to die, we call such tests death tests. More generally, any test that checks that a program terminates (except by throwing an exception) in an expected fashion is also a death test.

    Note that if a piece of code throws an exception, we don't consider it "death" for the purpose of death tests, as the caller of the code could catch the exception and avoid the crash. If you want to verify exceptions thrown by your code, see Exception Assertions.

    If you want to test EXPECT_*()/ASSERT_*() failures in your test code, see Catching Failures.

    How to Write a Death Test

    Google Test has the following macros to support death tests:

    Fatal assertionNonfatal assertionVerifies
    ASSERT_DEATH(statement, regex);EXPECT_DEATH(statement, regex);statement crashes with the given error
    ASSERT_DEATH_IF_SUPPORTED(statement, regex);EXPECT_DEATH_IF_SUPPORTED(statement, regex);if death tests are supported, verifies that statement crashes with the given error; otherwise verifies nothing
    ASSERT_EXIT(statement, predicate, regex);EXPECT_EXIT(statement, predicate, regex);statement exits with the given exit status, and its stderr matches regex

    where statement is a statement that is expected to cause the process to die, predicate is a function or function object that evaluates an integer exit status, and regex is a regular expression that the stderr output of statement is expected to match. Note that statement can be any valid statement (including a compound statement) and doesn't have to be an expression. As usual, the ASSERT variants abort the current test function, while the EXPECT variants don't.

    A predefined predicate is:

    ::testing::ExitedWithCode(exit_code)

    This expression is true if the program exited normally with the given exit code. On POSIX systems there is also:

    ::testing::KilledBySignal(signal_number)  // Not available on Windows.

    This expression is true if the program was killed by the given signal. The *_DEATH family of macros are convenient wrappers for *_EXIT that use a predicate verifying that the process' exit code is non-zero.

    To write a death test, simply use one of the above macros inside your test function. For example, the following code:

    TEST(MyDeathTest, Foo) {
      // This death test uses a compound statement.
      ASSERT_DEATH({ int n = 5; Foo(&n); }, "Error on line .* of Foo()");
    }

    TEST(MyDeathTest, NormalExit) {
      EXPECT_EXIT(NormalExit(), ::testing::ExitedWithCode(0), "Success");
    }

    TEST(MyDeathTest, KillMyself) {
      EXPECT_EXIT(KillMyself(), ::testing::KilledBySignal(SIGKILL),
                  "Sending myself unblockable signal");
    }

    verifies that:

    • the statement { int n = 5; Foo(&n); } causes the process to die with the given error message,
    • calling NormalExit() causes the process to print "Success" to stderr and exit with exit code 0, and
    • calling KillMyself() kills the process with signal SIGKILL.

    The test function body may contain other assertions and statements as well, if necessary.
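    The parent/child mechanics behind these macros can be illustrated in plain POSIX code: run the statement in a forked child and inspect the child's exit status in the parent. This is only a sketch of the idea (Google Test's real implementation also captures the child's stderr and matches it against the regex, which is omitted here):

    ```cpp
    #include <sys/wait.h>
    #include <unistd.h>

    // Runs `statement` in a forked child process and reports whether the
    // child exited normally with the expected exit code.  POSIX only.
    bool DiesWithExitCode(void (*statement)(), int expected_code) {
      pid_t pid = fork();
      if (pid < 0) return false;  // fork failed
      if (pid == 0) {
        statement();  // Runs in the child process only.
        _exit(0);     // Reached only if the statement didn't exit by itself.
      }
      int status = 0;
      waitpid(pid, &status, 0);
      return WIFEXITED(status) && WEXITSTATUS(status) == expected_code;
    }

    // An example "statement" that exits with a known code.
    void ExitWith42() { _exit(42); }
    ```

    Because the statement runs in a child process, its side effects (including crashing) cannot harm the parent test process; this is the same isolation property the death test macros rely on.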

    Important: We strongly recommend you to follow the convention of naming your test case (not test) *DeathTest when it contains a death test, as demonstrated in the above example. The Death Tests And Threads section below explains why.

    If a test fixture class is shared by normal tests and death tests, you can use typedef to introduce an alias for the fixture class and avoid duplicating its code:

    class FooTest : public ::testing::Test { ... };

    typedef FooTest FooDeathTest;

    TEST_F(FooTest, DoesThis) {
      // normal test
    }

    TEST_F(FooDeathTest, DoesThat) {
      // death test
    }

    Availability: Linux, Windows (requires MSVC 8.0 or above), Cygwin, and Mac. (ASSERT|EXPECT)_DEATH_IF_SUPPORTED are new in v1.4.0.

    Regular Expression Syntax

    On POSIX systems (e.g. Linux, Cygwin, and Mac), Google Test uses the POSIX extended regular expression syntax in death tests. To learn about this syntax, you may want to read the corresponding Wikipedia entry.

    On Windows, Google Test uses its own simple regular expression implementation. It lacks many features you can find in POSIX extended regular expressions. For example, we don't support union ("x|y"), grouping ("(xy)"), brackets ("[xy]"), and repetition count ("x{5,7}"), among others. Below is what we do support (A denotes a literal character, period (.), or a single \\ escape sequence; x and y denote regular expressions.):

    c      matches any literal character c
    \\d    matches any decimal digit
    \\D    matches any character that's not a decimal digit
    \\f    matches \f
    \\n    matches \n
    \\r    matches \r
    \\s    matches any ASCII whitespace, including \n
    \\S    matches any character that's not a whitespace
    \\t    matches \t
    \\v    matches \v
    \\w    matches any letter, _, or decimal digit
    \\W    matches any character that \\w doesn't match
    \\c    matches any literal character c, which must be a punctuation
    \\.    matches the . character
    .      matches any single character except \n
    A?     matches 0 or 1 occurrences of A
    A*     matches 0 or many occurrences of A
    A+     matches 1 or many occurrences of A
    ^      matches the beginning of a string (not that of each line)
    $      matches the end of a string (not that of each line)
    xy     matches x followed by y

    To help you determine which capability is available on your system, Google Test defines the macro GTEST_USES_POSIX_RE=1 when it uses POSIX extended regular expressions, or GTEST_USES_SIMPLE_RE=1 when it uses the simple version. If you want your death tests to work in both cases, you can either #if on these macros, or use only the more limited syntax.

    How It Works

    Under the hood, ASSERT_EXIT() spawns a new process and executes the death test statement in that process. The details of how precisely that happens depend on the platform and the variable ::testing::FLAGS_gtest_death_test_style (which is initialized from the command-line flag --gtest_death_test_style).

    • On POSIX systems, fork() (or clone() on Linux) is used to spawn the child, after which:
      • If the variable's value is "fast", the death test statement is immediately executed.
      • If the variable's value is "threadsafe", the child process re-executes the unit test binary just as it was originally invoked, but with some extra flags to cause just the single death test under consideration to be run.
    • On Windows, the child is spawned using the CreateProcess() API, and re-executes the binary to cause just the single death test under consideration to be run - much like the "threadsafe" mode on POSIX.

    Other values for the variable are illegal and will cause the death test to fail. Currently, the flag's default value is "fast". However, we reserve the right to change it in the future, so your tests should not depend on this.

    In either case, the parent process waits for the child process to complete, and checks that:

    • the child's exit status satisfies the predicate, and
    • the child's stderr matches the regular expression.

    If the death test statement runs to completion without dying, the child process will nonetheless terminate, and the assertion fails.

    Death Tests And Threads

    The reason for the two death test styles has to do with thread safety. Due to well-known problems with forking in the presence of threads, death tests should be run in a single-threaded context. Sometimes, however, it isn't feasible to arrange that kind of environment. For example, statically-initialized modules may start threads before main is ever reached. Once threads have been created, it may be difficult or impossible to clean them up.

    Google Test has three features intended to raise awareness of threading issues:

    1. A warning is emitted if multiple threads are running when a death test is encountered.
    2. Test cases with a name ending in "DeathTest" are run before all other tests.
    3. It uses clone() instead of fork() to spawn the child process on Linux (clone() is not available on Cygwin and Mac), as fork() is more likely to cause the child to hang when the parent process has multiple threads.

    It's perfectly fine to create threads inside a death test statement; they are executed in a separate process and cannot affect the parent.

    Death Test Styles

    The "threadsafe">::testing::FLAGS_gtest_death_test_style = "threadsafe";

    You can do this in main() to set the style for all death tests in the binary, or in individual tests. Recall that flags are saved before running each test and restored afterwards, so you need not do that yourself. For example:

    TEST(MyDeathTest, TestOne) {
      ::testing::FLAGS_gtest_death_test_style = "threadsafe";
      // This test is run in the "threadsafe" style:
      ASSERT_DEATH(ThisShouldDie(), "");
    }

    TEST(MyDeathTest, TestTwo) {
      // This test is run in the "fast" style:
      ASSERT_DEATH(ThisShouldDie(), "");
    }

    int main(int argc, char** argv) {
      ::testing::InitGoogleTest(&argc, argv);
      ::testing::FLAGS_gtest_death_test_style = "fast";
      return RUN_ALL_TESTS();
    }

    Caveats

    The statement argument of ASSERT_EXIT() can be any valid C++ statement. If it leaves the current function via a return statement or by throwing an exception, the death test is considered to have failed. Some Google Test macros may return from the current function (e.g. ASSERT_TRUE()), so be sure to avoid them in statement.

    Since statement runs in the child process, any in-memory side effect (e.g. modifying a variable, releasing memory, etc) it causes will not be observable in the parent process. In particular, if you release memory in a death test, your program will fail the heap check, as the parent process will never see the memory reclaimed. To handle this, you can

    1. try not to free memory in a death test;
    2. free the memory again in the parent process; or
    3. do not use the heap checker in your program.

    Due to an implementation detail, you cannot place multiple death test assertions on the same line; otherwise, compilation will fail with an unobvious error message.

    Despite the improved thread safety afforded by the "threadsafe" style of death test, thread problems such as deadlock are still possible in the presence of handlers registered with pthread_atfork(3).

    Using Assertions in Sub-routines

    Adding Traces to Assertions

    If a test sub-routine is called from several places, when an assertion inside it fails, it can be hard to tell which invocation of the sub-routine the failure is from. You can alleviate this problem using extra logging or custom failure messages, but that usually clutters up your tests. A better solution is to use theSCOPED_TRACE macro:

    SCOPED_TRACE(message);

    where message can be anything streamable to std::ostream. The SCOPED_TRACE macro will cause the current file name, line number, and the given message to be added to every failure message in its scope. The effect is undone when control leaves the current lexical scope.

    For example:

    10: void Sub1(int n) {
    11:   EXPECT_EQ(1, Bar(n));
    12:   EXPECT_EQ(2, Bar(n + 1));
    13: }
    14:
    15: TEST(FooTest, Bar) {
    16:   {
    17:     SCOPED_TRACE("A");  // This trace point will be included in
    18:                         // every failure in this scope.
    19:     Sub1(1);
    20:   }
    21:   // Now it won't.
    22:   Sub1(9);
    23: }

    could result in messages like these:

    path/to/foo_test.cc:11: Failure
    Value of: Bar(n)
    Expected: 1
      Actual: 2
       Trace:
    path/to/foo_test.cc:17: A

    path/to/foo_test.cc:12: Failure
    Value of: Bar(n + 1)
    Expected: 2
      Actual: 3

    Without the trace, it would be difficult to know which invocation of Sub1() the two failures come from. (You could add an extra message to each assertion in Sub1() to indicate the value of n, but that's tedious.)

    Some tips on using SCOPED_TRACE:

    1. With a suitable message, it's often enough to use SCOPED_TRACE at the beginning of a sub-routine, instead of at each call site.
    2. When calling sub-routines inside a loop, make the loop iterator part of the message in SCOPED_TRACE so that you can tell which iteration a failure is from.
    3. Sometimes the line number of the trace point is enough to identify the invocation; in that case you don't need a unique message and can simply use "".
    4. You can use SCOPED_TRACE in an inner scope when there is already one in the outer scope. In this case, all active trace points are included in the failure messages, in reverse of the order they are encountered.

    Availability: Linux, Windows, Mac.

    Propagating Fatal Failures

    A common pitfall when using ASSERT_* and FAIL* is not understanding that when they fail they only abort the current function, not the entire test. For example, the following test will segfault:

    void Subroutine() {
      // Generates a fatal failure and aborts the current function.
      ASSERT_EQ(1, 2);
      // The following won't be executed.
      ...
    }

    TEST(FooTest, Bar) {
      Subroutine();
      // The intended behavior is for the fatal failure
      // in Subroutine() to abort the entire test.
      // The actual behavior: the function goes on after Subroutine() returns.
      int* p = NULL;
      *p = 3;  // Segfault!
    }

    Since we don't use exceptions, it is technically impossible to implement the intended behavior here. To alleviate this, Google Test provides two solutions, covered in the following two sub-sections.

    Asserting on Subroutines

    As shown above, if your test calls a subroutine that has an ASSERT_* failure in it, the test will continue after the subroutine returns. This may not be what you want.

    Often people want fatal failures to propagate like exceptions. For that Google Test offers the following macros:

    Fatal assertionNonfatal assertionVerifies
    ASSERT_NO_FATAL_FAILURE(statement);EXPECT_NO_FATAL_FAILURE(statement);statement doesn't generate any new fatal failures in the current thread

    Only failures in the thread that executes the assertion are checked to determine the result of this type of assertion. If statement creates new threads, failures in these threads are ignored.

    Examples:

    ASSERT_NO_FATAL_FAILURE(Foo());

    int i;
    EXPECT_NO_FATAL_FAILURE({
      i = Bar();
    });

    Availability: Linux, Windows, Mac. Assertions from multiple threads are currently not supported.

    Checking for Failures in the Current Test

    HasFatalFailure() in the ::testing::Test class returns true if an assertion in the current test has suffered a fatal failure. This allows functions to catch fatal failures in a sub-routine and return early.

    class Test {
     public:
      ...
      static bool HasFatalFailure();
    };

    The typical usage, which basically simulates the behavior of a thrown exception, is:

    TEST(FooTest, Bar) {
      Subroutine();
      // Aborts if Subroutine() had a fatal failure.
      if (HasFatalFailure())
        return;
      // The following won't be executed.
      ...
    }

    If HasFatalFailure() is used outside of TEST(), TEST_F(), or a test fixture, you must add the ::testing::Test:: prefix, as in:

    if (::testing::Test::HasFatalFailure())
      return;

    Similarly, HasNonfatalFailure() returns true if the current test has at least one non-fatal failure, and HasFailure() returns true if the current test has at least one failure of either kind.

    Availability: Linux, Windows, Mac. HasNonfatalFailure() and HasFailure() are available since version 1.4.0.

    Logging Additional Information

    In your test code, you can call RecordProperty("key", value) to log additional information, where value can be either a string or an int. The last value recorded for a key will be emitted to the XML output if you specify one. For example, the test

    TEST_F(WidgetUsageTest, MinAndMaxWidgets) {
      RecordProperty("MaximumWidgets", ComputeMaxUsage());
      RecordProperty("MinimumWidgets", ComputeMinUsage());
    }

    will output XML like this:

    ...
    <testcase name="MinAndMaxWidgets" status="run" time="6" classname="WidgetUsageTest"
              MaximumWidgets="12" MinimumWidgets="9" />
    ...

    Note:

    • RecordProperty() is a static member of the Test class. Therefore it needs to be prefixed with ::testing::Test:: if used outside of the TEST body and the test fixture class.
    • key must be a valid XML attribute name, and cannot conflict with the ones already used by Google Test (name, status, time, and classname).

    Availability: Linux, Windows, Mac.

    Sharing Resources Between Tests in the Same Test Case

      Google Test creates a new test fixture object for each test in order to make tests independent and easier to debug. However, sometimes tests use resources that are expensive to set up, making the one-copy-per-test model prohibitively expensive.

      If the tests don't change the resource, there's no harm in them sharing a single resource copy. So, in addition to per-test set-up/tear-down, Google Test also supports per-test-case set-up/tear-down. To use it:

      1. In your test fixture class (say FooTest), define as static some member variables to hold the shared resources.
      2. In the same test fixture class, define a static void SetUpTestCase() function (remember not to spell it as SetupTestCase with a small u!) to set up the shared resources and a static void TearDownTestCase() function to tear them down.

      That's it! Google Test automatically calls SetUpTestCase() before running the first test in the FooTest test case (i.e. before creating the first FooTest object), and calls TearDownTestCase() after running the last test in it (i.e. after deleting the last FooTest object). In between, the tests can use the shared resources.

      Remember that the test order is undefined, so your code can't depend on a test preceding or following another. Also, the tests must either not modify the state of any shared resource, or, if they do modify the state, they must restore the state to its original value before passing control to the next test.

      Here's an example of per-test-case set-up and tear-down:

      class FooTest : public ::testing::Test {
       protected:
        // Per-test-case set-up.
        // Called before the first test in this test case.
        // Can be omitted if not needed.
        static void SetUpTestCase() {
          shared_resource_ = new ...;
        }

        // Per-test-case tear-down.
        // Called after the last test in this test case.
        // Can be omitted if not needed.
        static void TearDownTestCase() {
          delete shared_resource_;
          shared_resource_ = NULL;
        }

        // You can define per-test set-up and tear-down logic as usual.
        virtual void SetUp() { ... }
        virtual void TearDown() { ... }

        // Some expensive resource shared by all tests.
        static T* shared_resource_;
      };

      T* FooTest::shared_resource_ = NULL;

      TEST_F(FooTest, Test1) {
        ... you can refer to shared_resource here ...
      }

      TEST_F(FooTest, Test2) {
        ... you can refer to shared_resource here ...
      }

      Availability: Linux, Windows, Mac.

      Global Set-Up and Tear-Down

      Just as you can do set-up and tear-down at the test level and the test case level, you can also do it at the test program level. Here's how.

      First, subclass the ::testing::Environment class to define a test environment, which knows how to set up and tear down:

      class Environment {
       public:
        virtual ~Environment() {}
        // Override this to define how to set up the environment.
        virtual void SetUp() {}
        // Override this to define how to tear down the environment.
        virtual void TearDown() {}
      };

      Then, register an instance of your environment class with Google Test by calling the ::testing::AddGlobalTestEnvironment() function:

      Environment* AddGlobalTestEnvironment(Environment* env);

      Now, when RUN_ALL_TESTS() is called, it first calls the SetUp() method of the environment object, then runs the tests if there was no fatal failure, and finally calls TearDown() of the environment object.

      It's OK to register multiple environment objects. In this case, their SetUp() will be called in the order they are registered, and their TearDown() will be called in the reverse order. Note that Google Test takes ownership of the registered environment objects, so do not delete them yourself.

      You should call AddGlobalTestEnvironment() before RUN_ALL_TESTS() is called, probably in main(). If you use gtest_main, you need to call this before main() starts for it to take effect. One way to do this is to define a global variable like this:

      ::testing::Environment* const foo_env = ::testing::AddGlobalTestEnvironment(new FooEnvironment);

      However, we strongly recommend that you write your own main() and call AddGlobalTestEnvironment() there, as relying on initialization of global variables makes the code harder to read and may cause problems when you register multiple environments from different translation units and the environments have dependencies among them (remember that the compiler doesn't guarantee the order in which global variables from different translation units are initialized).

      Availability: Linux, Windows, Mac.

      Value Parameterized Tests

      Value-parameterized tests allow you to test your code with different parameters without writing multiple copies of the same test.

      Suppose your code is affected by a Boolean command-line flag, and you want to verify that it behaves correctly for both values of the flag. Without value-parameterized tests, you might start with a plain test:

      TEST(MyCodeTest, TestFoo) {
        // A code to test foo().
      }

      Usually people factor the test logic into a function with a parameter, and invoke it with different flag values:

      void TestFooHelper(bool flag_value) {
        flag = flag_value;
        // A code to test foo().
      }

      TEST(MyCodeTest, TestFoo) {
        TestFooHelper(false);
        TestFooHelper(true);
      }

      But this setup has serious drawbacks. First, when a test assertion fails, it is unclear which value of the parameter it failed for. You can stream a clarifying message into your EXPECT/ASSERT statements, but that clutters your code. Second, you have to add one such helper call for each new flag value, which is tedious.

      Value-parameterized tests let you write your test only once and then easily instantiate and run it with an arbitrary number of parameter values. They are also handy when you want to test different implementations of an interface, or to test your code over various inputs (a.k.a. data-driven testing).

      How to Write Value-Parameterized Tests

      To write value-parameterized tests, first define a fixture class. It must be derived from both ::testing::Test and ::testing::WithParamInterface<T> (the latter is a pure interface), where T is the type of your parameter values. For convenience, you can just derive the fixture class from ::testing::TestWithParam<T>, which itself is derived from both. T can be any copyable type; if it's a raw pointer, you are responsible for managing the lifespan of the pointed values.

      class FooTest : public ::testing::TestWithParam<const char*> {
        // You can implement all the usual fixture class members here.
        // To access the test parameter, call GetParam() from class
        // TestWithParam<T>.
      };

      // Or, when you want to add parameters to a pre-existing fixture class:
      class BaseTest : public ::testing::Test {
        ...
      };
      class BarTest : public BaseTest,
                      public ::testing::WithParamInterface<const char*> {
        ...
      };

      Then, use the TEST_P macro to define as many test patterns using this fixture as you want. The _P suffix is for "parameterized" or "pattern", whichever you prefer to think.

      TEST_P(FooTest, DoesBlah) {
        // Inside a test, access the test parameter with the GetParam() method
        // of the TestWithParam<T> class:
        EXPECT_TRUE(foo.Blah(GetParam()));
        ...
      }

      TEST_P(FooTest, HasBlahBlah) {
        ...
      }

      Finally, you can use INSTANTIATE_TEST_CASE_P to instantiate the test case with any set of parameters you want. Google Test defines a number of functions for generating test parameters. They return what we call (surprise!) parameter generators. Here is a summary of them, which are all in the testing namespace:

      Range(begin, end[, step])    Yields values {begin, begin+step, begin+step+step, ...}. The values do not include end. step defaults to 1.
      Values(v1, v2, ..., vN)      Yields values {v1, v2, ..., vN}.
      ValuesIn(container) and ValuesIn(begin, end)    Yields values from a C-style array, an STL-style container, or an iterator range [begin, end). container, begin, and end can be expressions whose values are determined at run time.
      Bool()                       Yields sequence {false, true}.
      Combine(g1, g2, ..., gN)     Yields all combinations (the Cartesian product for the math savvy) of the values generated by the N generators. This is only available if your system provides the <tr1/tuple> header; see comments in include/gtest/internal/gtest-port.h for more information.

      For more details, see the comments at the definitions of these functions in the source code.

      The following statement will instantiate tests from the FooTest test case, each with the parameter values "meeny", "miny", and "moe":

      INSTANTIATE_TEST_CASE_P(InstantiationName,
                              FooTest,
                              ::testing::Values("meeny", "miny", "moe"));

      To distinguish different instances of the pattern (yes, you can instantiate it more than once), the first argument to INSTANTIATE_TEST_CASE_P is a prefix that will be added to the actual test case name. Remember to pick unique prefixes for different instantiations. The tests from the instantiation above will have these names:

      • InstantiationName/FooTest.DoesBlah/0 for "meeny"
      • InstantiationName/FooTest.DoesBlah/1 for "miny"
      • InstantiationName/FooTest.DoesBlah/2 for "moe"
      • InstantiationName/FooTest.HasBlahBlah/0 for "meeny"
      • InstantiationName/FooTest.HasBlahBlah/1 for "miny"
      • InstantiationName/FooTest.HasBlahBlah/2 for "moe"

      You can use these names in --gtest_filter.

      This statement will instantiate all tests from FooTest again, each with parameter values "cat" and "dog":

      const char* pets[] = {"cat", "dog"};
      INSTANTIATE_TEST_CASE_P(AnotherInstantiationName,
                              FooTest,
                              ::testing::ValuesIn(pets));

      The tests from the instantiation above will have these names:

      • AnotherInstantiationName/FooTest.DoesBlah/0 for "cat"
      • AnotherInstantiationName/FooTest.DoesBlah/1 for "dog"
      • AnotherInstantiationName/FooTest.HasBlahBlah/0 for "cat"
      • AnotherInstantiationName/FooTest.HasBlahBlah/1 for "dog"

      Please note that INSTANTIATE_TEST_CASE_P will instantiate all tests in the given test case, whether their definitions come before or after the INSTANTIATE_TEST_CASE_P statement.

      You can see the sample files in the samples/ directory for more examples.

      Availability: Linux, Windows (requires MSVC 8.0 or above), Mac; since version 1.2.0.

      Creating Value-Parameterized Abstract Tests

      In the above, we define and instantiate FooTest in the same source file. Sometimes you may want to define value-parameterized tests in a library and let other people instantiate them later. This pattern is known as abstract tests. As an example of its application, when you are designing an interface you can write a standard suite of abstract tests (perhaps using a factory function as the test parameter) that all implementations of the interface are expected to pass. When someone implements the interface, they can instantiate your suite to get all the interface-conformance tests for free.

      To define abstract tests, organize your code like this:

      1. Put the definition of the parameterized test fixture class (e.g. FooTest) in a header file, say foo_param_test.h. Think of this as declaring your abstract tests.
      2. Put the TEST_P definitions in foo_param_test.cc, which includes foo_param_test.h. Think of this as implementing your abstract tests.

      Once they are defined, you can instantiate them by including foo_param_test.h, invoking INSTANTIATE_TEST_CASE_P(), and linking with foo_param_test.cc. You can instantiate the same abstract test case multiple times, possibly in different source files.

      Typed Tests

      Suppose you have multiple implementations of the same interface and want to make sure that all of them satisfy some common requirements. Or, you may have defined several types that are supposed to conform to the same "concept" and you want to verify it. In both cases, you want the same test logic repeated for different types.

      Typed tests allow you to repeat the same test logic over a list of types. You only need to write the test logic once, although you must know the type list when writing typed tests.

      First, define a fixture class template. It should be parameterized by a type. Remember to derive it from ::testing::Test:

      template <typename T>
      class FooTest : public ::testing::Test {
       public:
        ...
        typedef std::list<T> List;
        static T shared_;
        T value_;
      };

      Next, associate a list of types with the test case, which will be repeated for each type in the list:

      typedef ::testing::Types<char, int, unsigned int> MyTypes;
      TYPED_TEST_CASE(FooTest, MyTypes);

      The typedef is necessary for the TYPED_TEST_CASE macro to parse correctly. Otherwise the compiler will think that each comma in the type list introduces a new macro argument.

      Then, use TYPED_TEST() instead of TEST_F() to define a typed test for this test case. You can repeat this as many times as you want:

      TYPED_TEST(FooTest, DoesBlah) {
        // Inside a test, refer to the special name TypeParam to get the type
        // parameter.  Since we are inside a derived class template, C++ requires
        // us to visit the members of FooTest via 'this'.
        TypeParam n = this->value_;

        // To visit static members of the fixture, add the 'TestFixture::'
        // prefix.
        n += TestFixture::shared_;

        // To refer to typedefs in the fixture, add the 'typename TestFixture::'
        // prefix.  The 'typename' is required to satisfy the compiler.
        typename TestFixture::List values;
        values.push_back(n);
        ...
      }

      TYPED_TEST(FooTest, HasPropertyA) { ... }

      You can see samples/sample6_unittest.cc for a complete example.

      Availability: Linux, Windows (requires MSVC 8.0 or above), Mac; since version 1.1.0.

      Type-Parameterized Tests

      Type-parameterized tests are like typed tests, except that they don't require you to know the list of types ahead of time. Instead, you can define the test logic first and instantiate it with different type lists later. You can even instantiate it more than once in the same program.

      If you are designing an interface or concept, you can define a suite of type-parameterized tests to verify properties that any valid implementation of the interface/concept should have. Then, the author of each implementation can just instantiate the test suite with their type to verify that it conforms to the requirements, without having to write similar tests repeatedly.

      First, define a fixture class template, as we did with typed tests:

      template <typename T>
      class FooTest : public ::testing::Test {
        ...
      };

      Next, declare that you will define a type-parameterized test case:

      TYPED_TEST_CASE_P(FooTest);

      The _P suffix is for "parameterized" or "pattern", whichever you prefer to think.

      Then, use TYPED_TEST_P() to define a test for this test case. You can repeat this as many times as you want:

      TYPED_TEST_P(FooTest, DoesBlah) {
        // Inside a test, refer to TypeParam to get the type parameter.
        TypeParam n = 0;
        ...
      }

      TYPED_TEST_P(FooTest, HasPropertyA) { ... }

      Now the tricky part: you need to register all test patterns using the REGISTER_TYPED_TEST_CASE_P macro before you can instantiate them. The first argument of the macro is the test case name; the rest are the names of the tests in this test case:

      REGISTER_TYPED_TEST_CASE_P(FooTest,
                                 DoesBlah, HasPropertyA);

      Finally, you are free to instantiate the pattern with the types you want. If you put the above code in a header file, you can #include it in multiple C++ source files and instantiate it multiple times.

      typedef ::testing::Types<char, int, unsigned int> MyTypes;
      INSTANTIATE_TYPED_TEST_CASE_P(My, FooTest, MyTypes);

      To distinguish different instances of the pattern, the first argument to the INSTANTIATE_TYPED_TEST_CASE_P macro is a prefix that will be added to the actual test case name. Remember to pick unique prefixes for different instances.

      In the special case where the type list contains only one type, you can write that type directly without ::testing::Types<...>, like this:

      INSTANTIATE_TYPED_TEST_CASE_P(My, FooTest, int);

      You can see samples/sample6_unittest.cc for a complete example.

      Availability: Linux, Windows (requires MSVC 8.0 or above), Mac; since version 1.1.0.

      Testing Private Code

      If you change your software's internal implementation, your tests should not break as long as the change is not observable by users. Therefore, per the black-box testing principle, most of the time you should test your code through its public interfaces.

      If you still find yourself needing to test internal implementation code, consider whether there is a better design that wouldn't require you to do so. If you absolutely have to test non-public interface code, though, you can. There are two cases to consider:

      • static functions (not the same as static member functions!) or unnamed namespaces, and
      • private or protected class members.

      Static Functions

      Both static functions and definitions/declarations in an unnamed namespace are only visible within the same translation unit. To test them, you can #include the entire .cc file being tested in your *_test.cc file. (#including .cc files is not a good way to reuse code - you should not do this in production code!)

      However, a better approach is to move the private code into the foo::internal namespace, where foo is the namespace your project normally uses, and put the private declarations in a *-internal.h file. Your production .cc files and your tests are allowed to include this internal header, but your clients are not. This way, you can fully test your internal implementation without leaking it to your clients.

      Private Class Members

      Private class members are only accessible from within the class or by friends. To access a class' private members, you can declare your test fixture as a friend of the class and define accessors in your fixture. Tests using the fixture can then access the private members via these accessors. Note that even though your fixture is a friend of your class, your tests are not automatically friends of it, as they are technically defined in sub-classes of the fixture.

      Another way to test private members is to refactor them into an implementation class, which is then declared in a *-internal.h file. Your clients aren't allowed to include this header, but your tests can. This is called the Pimpl (Private Implementation) idiom.

      Or, you can declare an individual test as a friend of your class by adding this line in the class body:

      FRIEND_TEST(TestCaseName, TestName);

      For example,

      // foo.h
      #include "gtest/gtest_prod.h"  // Defines FRIEND_TEST.

      class Foo {
        ...
       private:
        FRIEND_TEST(FooTest, BarReturnsZeroOnNull);
        int Bar(void* x);
      };

      // foo_test.cc
      ...
      TEST(FooTest, BarReturnsZeroOnNull) {
        Foo foo;
        EXPECT_EQ(0, foo.Bar(NULL));  // Uses Foo's private member Bar().
      }

      Pay special attention when your class is defined in a namespace: if you want your test fixtures and tests to be friends of your class, you must define them in the exact same namespace. For example, if the code to be tested looks like:

      namespace my_namespace {

      class Foo {
        friend class FooTest;
        FRIEND_TEST(FooTest, Bar);
        FRIEND_TEST(FooTest, Baz);
        ...
        definition of the class Foo
        ...
      };

      }  // namespace my_namespace

      Your test code should be something like:

      namespace my_namespace {

      class FooTest : public ::testing::Test {
       protected:
        ...
      };

      TEST_F(FooTest, Bar) { ... }
      TEST_F(FooTest, Baz) { ... }

      }  // namespace my_namespace

      Catching Failures

      If you are building a testing utility on top of Google Test, you'll want to test your utility. What framework would you use to test it? Google Test, of course.

      The challenge is to verify that your testing utility reports failures correctly. In frameworks that report a failure by throwing an exception, you could catch the exception and assert on it. But Google Test doesn't use exceptions, so how do we test that a piece of code generates an expected failure?

      "gtest/gtest-spi.h" contains some constructs to do this. After #including this header, you can use

      EXPECT_FATAL_FAILURE(statement, substring);

      to assert that statement generates a fatal (e.g. ASSERT_*) failure whose message contains the given substring, or use

      EXPECT_NONFATAL_FAILURE(statement, substring);

      if you are expecting a non-fatal (e.g. EXPECT_*) failure.

      For technical reasons, there are some caveats:

      1. You cannot stream a failure message to either macro.
      2. statement in EXPECT_FATAL_FAILURE() cannot reference local non-static variables or non-static members of this object.
      3. statement in EXPECT_FATAL_FAILURE() cannot return a value.

      Note: Google Test is designed with threads in mind. Once the synchronization primitives in "gtest/internal/gtest-port.h" have been implemented, Google Test will become thread-safe, meaning that you can then use assertions in multiple threads concurrently. Before that, however, Google Test only supports single-threaded usage. Once thread-safe, EXPECT_FATAL_FAILURE() and EXPECT_NONFATAL_FAILURE() will capture failures in the current thread only. If statement creates new threads, failures in these threads will be ignored. If you want to capture failures from all threads instead, you should use the following macros:

      EXPECT_FATAL_FAILURE_ON_ALL_THREADS(statement, substring);
      EXPECT_NONFATAL_FAILURE_ON_ALL_THREADS(statement, substring);

      Getting the Current Test's Name

      Sometimes a function may need to know the name of the currently running test. For example, you may be using the SetUp() method of your test fixture to set the golden file name based on which test is running. The ::testing::TestInfo class has this information:

      namespace testing {

      class TestInfo {
       public:
        // Returns the test case name and the test name, respectively.
        //
        // Do NOT delete or free the return value - it's managed by the
        // TestInfo class.
        const char* test_case_name() const;
        const char* name() const;
      };

      }  // namespace testing

      To obtain a TestInfo object for the currently running test, call current_test_info() on the UnitTest singleton object:

      // Gets information about the currently running test.
      // Do NOT delete the returned object - it's managed by the UnitTest class.
      const ::testing::TestInfo* const test_info =
          ::testing::UnitTest::GetInstance()->current_test_info();
      printf("We are in test %s of test case %s.\n",
             test_info->name(), test_info->test_case_name());

      current_test_info() returns a null pointer if no test is running. In particular, you cannot find the test case name in SetUpTestCase(), TearDownTestCase() (where you know the test case name implicitly anyway), or functions called from them.

      Availability: Linux, Windows, Mac.

      Extending Google Test by Handling Test Events

      Google Test provides an event listener API to let you receive notifications about the progress of a test program and test failures. The events you can listen to include the start and end of the test program, a test case, or a test method, among others. You may use this API to augment or replace the standard console output, replace the XML output, or provide a completely different form of output, such as a GUI or a database. You can also use test events as checkpoints to implement a resource leak checker, for example.

      Defining Event Listeners

      To define an event listener, you subclass either testing::TestEventListener or testing::EmptyTestEventListener. The former is an (abstract) interface, where each pure virtual method can be overridden to handle a test event (for example, when a test starts, the OnTestStart() method will be called). The latter provides an empty implementation of all methods in the interface, such that a subclass only needs to override the methods it cares about.

      When an event is fired, its context is passed to the handler function as an argument. The following argument types are used:

      • UnitTest reflects the state of the entire test program,
      • TestCase has information about a test case, which can contain one or more tests,
      • TestInfo contains the state of a test, and
      • TestPartResult represents the result of a test assertion.

      An event handler function can examine the argument it receives to find out interesting information about the event and the test program's state. Here's an example:

      class MinimalistPrinter : public ::testing::EmptyTestEventListener {
        // Called before a test starts.
        virtual void OnTestStart(const ::testing::TestInfo& test_info) {
          printf("*** Test %s.%s starting.\n",
                 test_info.test_case_name(), test_info.name());
        }

        // Called after a failed assertion or a SUCCEED() invocation.
        virtual void OnTestPartResult(
            const ::testing::TestPartResult& test_part_result) {
          printf("%s in %s:%d\n%s\n",
                 test_part_result.failed() ? "*** Failure" : "Success",
                 test_part_result.file_name(),
                 test_part_result.line_number(),
                 test_part_result.summary());
        }

        // Called after a test ends.
        virtual void OnTestEnd(const ::testing::TestInfo& test_info) {
          printf("*** Test %s.%s ending.\n",
                 test_info.test_case_name(), test_info.name());
        }
      };
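      The dispatch mechanism itself is a plain observer pattern and can be sketched without Google Test. The sketch below (all names are our own, for illustration) also demonstrates a property of the real listener list: start events are dispatched to listeners in the order they were appended, and end events in reverse order, so output from listeners added later is framed by output from listeners added earlier:

      ```cpp
      #include <string>
      #include <vector>

      // Abstract listener interface, analogous in spirit to TestEventListener.
      class Listener {
       public:
        virtual ~Listener() {}
        virtual void OnTestStart(std::string* log) = 0;
        virtual void OnTestEnd(std::string* log) = 0;
      };

      // A listener list: start events fire in append order,
      // end events fire in reverse order.
      class ListenerList {
       public:
        void Append(Listener* l) { listeners_.push_back(l); }

        void FireTestStart(std::string* log) {
          for (size_t i = 0; i < listeners_.size(); ++i)
            listeners_[i]->OnTestStart(log);
        }

        void FireTestEnd(std::string* log) {
          for (size_t i = listeners_.size(); i > 0; --i)
            listeners_[i - 1]->OnTestEnd(log);
        }

       private:
        std::vector<Listener*> listeners_;  // not owned, for brevity
      };

      // A listener that tags the log with its name.
      class NamedListener : public Listener {
       public:
        explicit NamedListener(const std::string& name) : name_(name) {}
        virtual void OnTestStart(std::string* log) { *log += name_ + "<"; }
        virtual void OnTestEnd(std::string* log) { *log += ">" + name_; }
       private:
        std::string name_;
      };
      ```

      Appending listeners "a" then "b" and firing a start/end pair produces the nested log "a<b<>b>a", mirroring how the default printers frame the output of listeners you add yourself.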

      Using Event Listeners

      To use the event listener you have defined, add an instance of it to the Google Test event listener list (represented by class TestEventListeners - note the "s" at the end of the name) in your main() function, before calling RUN_ALL_TESTS():

      int main(int argc, char** argv) {
        ::testing::InitGoogleTest(&argc, argv);
        // Gets hold of the event listener list.
        ::testing::TestEventListeners& listeners =
            ::testing::UnitTest::GetInstance()->listeners();
        // Adds a listener to the end.  Google Test takes the ownership.
        listeners.Append(new MinimalistPrinter);
        return RUN_ALL_TESTS();
      }

      There's only one problem: the default test result printer is still in effect, so its output will mingle with the output from your minimalist printer. To suppress the default printer, just release it from the event listener list and delete it, by adding one line:

      ...
      delete listeners.Release(listeners.default_result_printer());
      listeners.Append(new MinimalistPrinter);
      return RUN_ALL_TESTS();

      Now, sit back and enjoy a completely different output from your tests. For more details, you can read the event-listener sample in the samples/ directory.

      You may append more than one listener to the list. When an On*Start() or OnTestPartResult() event is fired, the listeners will receive it in the order they appear in the list (since new listeners are added to the end of the list, the default text printer and the default XML generator will receive the event first). An On*End() event will be received by the listeners in the reverse order. This allows output by listeners added later to be framed by output from listeners added earlier.

      Generating Failures in Listeners

      You may use failure-raising macros (EXPECT_*(), ASSERT_*(), FAIL(), etc) when processing an event. There are some restrictions:

      1. You cannot generate any failure in OnTestPartResult() (otherwise it will cause OnTestPartResult() to be called recursively).
      2. A listener that handles OnTestPartResult() is not allowed to generate any failure.

      When you add listeners to the listener list, you should put listeners that handle OnTestPartResult() before listeners that can generate failures. This ensures that failures generated by the latter are attributed to the right test by the former.

      Running Test Programs: Advanced Options

      Google Test test programs are ordinary executables. Once built, you can run them directly and affect their behavior via various command-line flags. For a flag --gtest_foo you can alternatively set the corresponding environment variable GTEST_FOO; a flag given on the command line takes precedence over the environment variable.

      You can also set a flag's default value programmatically, before calling InitGoogleTest(), so that the user can still override it on the command line:

      int main(int argc, char** argv) {
        // Disables elapsed time by default.
        ::testing::GTEST_FLAG(print_time) = false;

        // This allows the user to override the flag on the command line.
        ::testing::InitGoogleTest(&argc, argv);

        return RUN_ALL_TESTS();
      }

      Selecting Tests

      This section shows the various options for selecting which tests to run.

      Listing Test Names

      Sometimes it is necessary to list the available tests in a program before running them so that a filter may be applied if needed. Running the program with the --gtest_list_tests flag overrides all other flags and lists tests in the following format:

      TestCase1.
        TestName1
        TestName2
      TestCase2.
        TestName

      None of the tests listed are actually run if the flag is provided. There is no corresponding environment variable for this flag.

      Availability: Linux, Windows, Mac.

      Running a Subset of the Tests

      By> Also runs everything, due to the single match-everything * value.

      • ./foo_test --gtest_filter=FooTest.* Runs> Runs any test whose full name contains either "Null" or "Constructor".
      • ./foo_test --gtest_filter=-*DeathTest.* Runs> Runs everything in test case FooTest except FooTest.Bar.
      Availability: Linux, Windows, Mac.
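        The pattern matching described above ('*' matches any string, '?' matches any single character) can be sketched in plain C++. This recursive implementation is our own illustration of the documented semantics, not Google Test's actual code:

        ```cpp
        // Returns true iff `name` matches `pattern`, where '*' matches any
        // string (possibly empty) and '?' matches any single character.
        bool PatternMatches(const char* pattern, const char* name) {
          if (*pattern == '\0') return *name == '\0';
          if (*pattern == '*') {
            // '*' either matches nothing here, or swallows one more
            // character of name.
            return PatternMatches(pattern + 1, name) ||
                   (*name != '\0' && PatternMatches(pattern, name + 1));
          }
          if (*name == '\0') return false;
          return (*pattern == '?' || *pattern == *name) &&
                 PatternMatches(pattern + 1, name + 1);
        }
        ```

        A full filter would additionally split the string on ':' and '-' to build positive and negative pattern lists, then accept a test name iff it matches some positive pattern and no negative one.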

        Temporarily Disabling Tests

        If you have a broken test that you cannot fix right away, you can add the DISABLED_ prefix to its name. This will exclude it from execution. This is better than commenting out the code or using #if 0, as disabled tests are still compiled (and thus won't rot).

        If you need to disable all tests in a test case, you can either add DISABLED_ to the front of the name of each test, or alternatively add it to the front of the test case name.

        For example, the following tests won't be run by Google Test, even though they will still be compiled:

        // Tests that Foo does Abc.
        TEST(FooTest, DISABLED_DoesAbc) { ... }

        class DISABLED_BarTest : public ::testing::Test { ... };

        // Tests that Bar does Xyz.
        TEST_F(DISABLED_BarTest, DoesXyz) { ... }

        Note: This feature should only be used for temporary pain-relief. You still have to fix the disabled tests at a later date. As a reminder, Google Test will print a banner warning you if a test program contains any disabled tests.

        Tip: You can easily count the number of disabled tests you have using grep. This number can be used as a metric for improving your test quality.

        Availability: Linux, Windows, Mac; since version 1.1.0.

        Temporarily Enabling Disabled Tests

        To include disabled tests in test execution, just invoke the test program with the --gtest_also_run_disabled_tests flag or set the GTEST_ALSO_RUN_DISABLED_TESTS environment variable to a value other than 0. You can combine this with the --gtest_filter flag to further select which disabled tests to run.

        Availability: Linux, Windows, Mac; since version 1.3.0.

        Repeating the Tests

        Once in a while you'll run into a test whose result is hit-or-miss. Perhaps it will fail only 1% of the time, making it rather hard to reproduce the bug under a debugger. This can be a major source of frustration.

        The --gtest_repeat flag allows you to repeat all (or selected) test methods in a program many times. Hopefully, a flaky test will eventually fail and give you a chance to debug. Here's how to use it:

        $ foo_test --gtest_repeat=1000
        Repeat foo_test 1000 times and don't stop at failures.

        $ foo_test --gtest_repeat=-1
        A negative count means repeating forever.

        $ foo_test --gtest_repeat=1000 --gtest_break_on_failure
        Repeat foo_test 1000 times, stopping at the first failure. This is especially
        useful when running under a debugger: when the test fails, it will drop into
        the debugger and you can then inspect variables and stacks.

        $ foo_test --gtest_repeat=1000 --gtest_filter=FooBar
        Repeat the tests whose name matches the filter 1000 times.

        If your test program contains global set-up/tear-down code registered using AddGlobalTestEnvironment(), it will be repeated in each iteration as well, as the flakiness may be in it. You can also specify the repeat count by setting the GTEST_REPEAT environment variable.

        Availability: Linux, Windows, Mac.

        Shuffling the Tests

        You can specify the --gtest_shuffle flag (or set the GTEST_SHUFFLE environment variable to 1) to run the tests in a program in a random order. This helps to reveal bad dependencies between tests.

        By default, Google Test uses a random seed calculated from the current time; therefore you'll get a different order every time. The console output includes the random seed value, so that you can reproduce an order-related test failure later. To specify the random seed explicitly, use the --gtest_random_seed=SEED flag (or set the GTEST_RANDOM_SEED environment variable), where SEED is an integer between 0 and 99999. The seed value 0 is special: it tells Google Test to do the default behavior of calculating the seed from the current time.

        If you combine this with --gtest_repeat=N, Google Test will pick a different random seed and re-shuffle the tests in each iteration.
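        The key property of seed-driven shuffling, namely that the same seed always reproduces the same order, can be sketched in plain C++ (this is our own illustration using the standard library, not Google Test's shuffling code):

        ```cpp
        #include <algorithm>
        #include <random>
        #include <string>
        #include <vector>

        // Returns the test names in a shuffled order determined entirely by
        // `seed`.  Re-running with the same seed reproduces the same order,
        // which is what makes an order-dependent failure reproducible once
        // the console output has reported the seed.
        std::vector<std::string> ShuffledOrder(std::vector<std::string> tests,
                                               unsigned seed) {
          std::mt19937 rng(seed);  // deterministic generator, seeded explicitly
          std::shuffle(tests.begin(), tests.end(), rng);
          return tests;
        }
        ```

        Note that the result is still a permutation of the original list: shuffling changes only the order in which tests run, never which tests run.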

        Availability: Linux, Windows, Mac; since v1.4.0.

        Controlling Test Output

        This section teaches how to tweak the way test results are reported.

        Colored Terminal Output

        Google Test can use colors in its terminal output to make it easier to spot the separation between tests, and whether tests passed. You can set the GTEST_COLOR environment variable or the --gtest_color command line flag to yes, no, or auto (the default) to enable colors, disable colors, or let Google Test decide. When the value is auto, Google Test will use colors if and only if the output goes to a terminal and (on non-Windows platforms) the TERM environment variable is set to xterm or xterm-color.

        Availability: Linux, Windows, Mac.

        Suppressing the Elapsed Time

        By default, Google Test prints the time it takes to run each test. To suppress that, run the test program with the --gtest_print_time=0 command line flag. Setting the GTEST_PRINT_TIME environment variable to 0 has the same effect.

        Availability: Linux, Windows, Mac.

        Generating an XML Report

        Google Test can emit a detailed XML report to a file in addition to its normal textual output. The report contains the duration of each test, and thus can help you identify slow tests.

        To generate the XML report, set the GTEST_OUTPUT environment variable or the --gtest_output flag to the string "xml:path_to_output_file", which will create the file at the given location. You can also just use the string "xml", in which case the output can be found in the test_detail.xml file in the current directory.

        If you specify a directory (for example, "xml:output/directory/" on Linux or "xml:output\directory\" on Windows), Google Test will create the XML file in that directory, named after the test executable (e.g. foo_test.xml for test program foo_test or foo_test.exe). If the file already exists (perhaps left over from a previous run), Google Test will pick a different name (e.g. foo_test_1.xml) to avoid overwriting it.

        The report format is based on the junitreport Ant task and can be parsed by popular continuous build systems like Hudson. Since that format was originally intended for Java, a little interpretation is required to make it apply to Google Test tests, as shown here:

        <testsuites name="AllTests" ...>
          <testsuite name="test_case_name" ...>
            <testcase name="test_name" ...>
              <failure message="..."/>
              <failure message="..."/>
              <failure message="..."/>
            </testcase>
          </testsuite>
        </testsuites>

        • The root <testsuites> element corresponds to the entire test program.
        • <testsuite> elements correspond to Google Test test cases.
        • <testcase> elements correspond to Google Test test functions.

        For instance, the following program

        TEST(MathTest, Addition) { ... }
        TEST(MathTest, Subtraction) { ... }
        TEST(LogicTest, NonContradiction) { ... }

          could generate this report:

          <?xml version="1.0" encoding="UTF-8"?>
          <testsuites tests="3" failures="1" errors="0" time="35" name="AllTests">
            <testsuite name="MathTest" tests="2" failures="1" errors="0" time="15">
              <testcase name="Addition" status="run" time="7" classname="">
                <failure message="Value of: add(1, 1)&#x0A; Actual: 3&#x0A;Expected: 2" type=""/>
                <failure message="Value of: add(1, -1)&#x0A; Actual: 1&#x0A;Expected: 0" type=""/>
              </testcase>
              <testcase name="Subtraction" status="run" time="5" classname="">
              </testcase>
            </testsuite>
            <testsuite name="LogicTest" tests="1" failures="0" errors="0" time="5">
              <testcase name="NonContradiction" status="run" time="5" classname="">
              </testcase>
            </testsuite>
          </testsuites>

          Things to note:

          • The tests attribute of a <testsuites> or <testsuite> element tells how many test functions the Google Test program or test case contains, while the failures attribute tells how many of them failed.
          • The time attribute expresses the duration of the test, test case, or entire test program in milliseconds.
          • Each <failure> element corresponds to a single failed Google Test assertion.
          • Some JUnit concepts don't apply to Google Test, yet we have to conform to the DTD. Therefore you'll see some dummy elements and attributes in the report. You can safely ignore these parts.

          Availability: Linux, Windows, Mac.

          Controlling How Failures Are Reported

          Turning Assertion Failures into Break-Points

          When running test programs under a debugger, it's very convenient if the debugger can catch an assertion failure and automatically drop into interactive mode. Google Test's break-on-failure mode supports this behavior.

          To enable it, set the GTEST_BREAK_ON_FAILURE environment variable to a value other than 0. Alternatively, you can use the --gtest_break_on_failure command line flag.

          Availability: Linux, Windows, Mac.

          Disabling Catching Test-Thrown Exceptions

          Google Test can be used either with or without exceptions enabled. If a test throws a C++ exception or (on Windows) a structured exception (SEH), by default Google Test catches it, reports it as a test failure, and continues with the next test method. This maximizes the coverage of a test run. Also, on Windows an uncaught exception will cause a pop-up window, so catching the exceptions allows you to run the tests automatically.

          When debugging the test failures, however, you may instead want the exceptions to be handled by the debugger, so that you can examine the call stack when an exception is thrown. To achieve that, set the GTEST_CATCH_EXCEPTIONS environment variable to 0, or use the --gtest_catch_exceptions=0 flag when running the tests.

          Availability: Linux, Windows, Mac.

          Letting Another Testing Framework Drive

          If>#include "gtest/gtest.h" int main(int argc, char** argv) { ::testing::GTEST_FLAG(throw_on_failure) = true; // Important: Google Test must be initialized. ::testing::InitGoogleTest(&argc, argv); ... whatever your existing testing framework requires ... }

          With that, you can use Google Test assertions in addition to the native assertions your testing framework provides, for example:

          void TestFooDoesBar() {
            Foo foo;
            EXPECT_LE(foo.Bar(1), 100);     // A Google Test assertion.
            CPPUNIT_ASSERT(foo.IsEmpty());  // A native assertion.
          }

          If a Google Test assertion fails, it will print an error message and throw an exception, which will be treated as a failure by your host testing framework. If you compile your code with exceptions disabled, a failed Google Test assertion will instead exit your program with a non-zero code, which will also signal a test failure to your test runner. If you don't write ::testing::GTEST_FLAG(throw_on_failure) = true; in your main(), Google Test assertions will still print failure messages, but they won't be able to propagate the failure to the host framework, so make sure this line is present.

          Availability: Linux, Windows, Mac; since v1.3.0.

          Distributing Test Functions to Multiple Machines

          If you have more than one machine you can use to run a test program, you might want to run the test functions in parallel and get the result faster. We call this technique sharding, where each machine is called a shard.

          Google Test is compatible with test sharding. To take advantage of this feature, your test runner (not part of Google Test) needs to do the following:

          1. Allocate a number of machines (shards) to run the tests.
          2. On each shard, set the GTEST_TOTAL_SHARDS environment variable to the total number of shards. It must be the same for all shards.
          3. On each shard, set the GTEST_SHARD_INDEX environment variable to the index of the shard. Different shards must be assigned different indices, which must be in the range [0, GTEST_TOTAL_SHARDS - 1].
          4. Run the same test program on all shards. When Google Test sees the above two environment variables, it will select a subset of the test functions to run. Across all shards, each test function in the program will be run exactly once.
          5. Wait for all shards to finish, then collect and report the results.

          Your project may have tests that were written without Google Test and thus don't understand this protocol. In order for your test runner to figure out which programs support sharding, it can set the environment variable GTEST_SHARD_STATUS_FILE to a non-existent file path. If a test program supports sharding, it will create this file to acknowledge the fact; otherwise it will not create it.

          Here's an example to make it clear. Suppose you have a test program foo_test that contains the following 5 test functions:

          TEST(A, V)
          TEST(A, W)
          TEST(B, X)
          TEST(B, Y)
          TEST(B, Z)

          and you have 3 machines at your disposal. To run the test functions in parallel, you would set GTEST_TOTAL_SHARDS to 3 on all machines, and set GTEST_SHARD_INDEX to 0, 1, and 2 on the machines respectively. Then you would run the same foo_test on each machine.

          Google Test reserves the right to change how the work is distributed across the shards, but here's one possible scenario: machine #0 gets A.V and B.X, machine #1 gets A.W and B.Y, and machine #2 gets B.Z.

          Availability: Linux, Windows, Mac; since version 1.3.0.

          Fusing Google Test Source Files

          Google Test's implementation consists of about 30 files (excluding its own tests). Sometimes you may want them to be packaged up in two files (a .h and a .cc) instead, so that you can easily copy them to a new machine and start hacking there. For this we provide an experimental Python script fuse_gtest_files.py in the scripts/ directory (since release 1.3.0). Assuming you have Python 2.4 or above installed on your machine, just go to that directory and run

          python fuse_gtest_files.py OUTPUT_DIR

          and you should see an OUTPUT_DIR directory being created with files gtest/gtest.h and gtest/gtest-all.cc in it. These files contain everything you need to use Google Test. Just copy them to anywhere you want and you are ready to write tests. You can use the scripts/test/Makefile file as an example on how to compile your tests against them.

          Where to Go from Here

          Congratulations! You've now learned more advanced Google Test tools and are ready to tackle more complex testing tasks. If you want to dive even deeper, you can read the Frequently-Asked Questions.
