

Last year, with little fanfare, we presented our first annual FPGA Journal Reader's Choice Awards. The response was fantastic, and everybody wanted to know how they could participate in this year's awards process… It doesn't work that way, of course. We use a super-secret balloting system and carefully guard the data to prevent any unscrupulous parties from tampering with the results.

This year, over 350 design teams answered our call to rate their experience with FPGA and EDA companies' products and services. Each customer was required to answer based on a specific design project they had already completed, and they were allowed to give responses based only on the products and services they had actually used. (A user of vendor A's tools was not allowed to rate vendor B's products.) We also normalized the results to be sure that nobody had an advantage based on number of responses; we wanted the little guys with just a few customers to have just as good an opportunity to win as the big companies with hundreds of users responding. We double-dog checked the answers (even, cleverly, the IP addresses, e-mail domains, and other data that needed to match) to be sure that no sneaky faux FPGA users were trying to skew the sanctity of our results. To the best of our ability, we certify the following results as accurate within our survey samples.

So, once again, still with no comedians or celebrities to read the results and completely devoid of little gold statues, here are the winners of FPGA Journal's second annual Reader's Choice Awards:

Highest reader/customer satisfaction with an FPGA vendor's tools: Xilinx. You told us that Xilinx has made "significant improvements" to their tool suite over the past year, particularly in the DSP and embedded design areas.

Highest reader/customer satisfaction with an FPGA vendor's support: Xilinx, for their support staff and application engineers. Once again, you told us that Xilinx consistently sets the standard for support staff and resources, particularly with their "robust website" and "responsive and knowledgeable application engineers."

Highest reader/customer satisfaction with an FPGA vendor's devices: Xilinx, for their Virtex and Spartan FPGAs. You told us that Xilinx continues to lead the pack in performance and features, and goes the extra mile in "explaining how to use their devices for a particular class of application."

Highest reader/customer satisfaction with an EDA vendor's HDL simulator performance, ease-of-use, and reliability: Mentor Graphics, for their ModelSim simulator. This year, ModelSim swept all categories in reader satisfaction. You told us that ModelSim "continues to improve for FPGA use" and is "consistently of high value in debugging" your designs.

Highest reader/customer satisfaction with a synthesis tool's performance, ease-of-use, reliability, quality of results, and language support: Synplicity, for their Synplify Pro synthesis tool. You told us that Synplicity's Synplify Pro clearly set the standard in five of the six categories we measured. Synplify Pro also got the highest rating we measured in any category with its "ease-of-use" score.

Highest reader/customer satisfaction with a synthesis tool's analysis capability: Mentor Graphics, for their Precision synthesis tool.
