15-bit vs default 12-bit precision
Posted: Fri Jan 05, 2018 6:29
This post:
https://forums.eagle.ru/showpost.php?p= ... stcount=11
says that the default 12-bit precision can be changed to 15-bit using the configuration software.
My questions are:
(1) Is there any harm in setting all axes etc. to the maximum precision? Are there trade-offs, e.g. in processor load or performance?
(2) Why was the default set to 12-bit (especially if the answer to [1] is "no")?
Thanks!
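For context (my own illustration, not from the linked post): the bit depth determines how many discrete positions an axis can report, so the jump from 12-bit to 15-bit is a factor of eight in resolution. A quick sketch of the arithmetic:

```python
# Number of discrete positions an axis can report at each bit depth.
steps_12bit = 2 ** 12  # 4096 positions
steps_15bit = 2 ** 15  # 32768 positions

print(steps_12bit, steps_15bit, steps_15bit // steps_12bit)
# → 4096 32768 8  (15-bit gives 8x finer steps)
```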