Question:

What is the difference between %f and %g?

by Guest11028

This is related to C programming.


1 ANSWER


  1. %f is the fixed-point conversion specifier. It prints the value in the form mmm.dddddd, with six digits after the decimal point by default (a precision such as %.2f changes that). The field width is a minimum, not a maximum: if the converted value is wider than the given field, printf expands the field rather than truncating it or substituting fill characters.

    %g chooses between the two styles automatically: it uses the %f style when the exponent is at least -4 and less than the precision, and shifts to the %e (scientific notation) style otherwise. It also strips trailing zeros, and its precision counts significant digits rather than digits after the decimal point.
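
    A minimal sketch showing the difference (the sample values here are arbitrary, chosen only to trigger each behavior; default precision is assumed):

        #include <stdio.h>

        int main(void)
        {
            double small = 3.14159;
            double large = 123456789.0;
            double tiny  = 0.000012345;

            /* %f always prints fixed-point, six digits after the decimal point by default */
            printf("%f\n", small);   /* 3.141590 */
            printf("%f\n", large);   /* 123456789.000000 */
            printf("%f\n", tiny);    /* 0.000012 */

            /* %g picks %f or %e style based on the exponent and strips trailing zeros */
            printf("%g\n", small);   /* 3.14159 */
            printf("%g\n", large);   /* 1.23457e+08 */
            printf("%g\n", tiny);    /* 1.2345e-05 */

            return 0;
        }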
