ICMC 2013 Participant Survey: Overall Survey Results
1. Please select a category that best describes your participation.

Answer                                          Responses   Ratio
Developer of Cryptographic Modules                     19   43.1%
Accredited Laboratory                                  14   31.8%
Consultant                                              4    9.0%
Government Organization (Validation Program)            2    4.5%
End-User of Cryptographic Modules                       5   11.3%
Provider of Support Equipment and Tools                 2    4.5%
Other                                                   2    4.5%
Totals                                                 44    100%
2. Which of the conference tracks did you spend the most time attending?

Answer                          Responses   Ratio
Technical Track                        21   47.7%
Certification Programs Track           25   56.8%
Other                                   2    4.5%
Totals                                 44    100%
3. Tuesday, September 24, Pre-conference Workshops: Please rate the overall quality of speakers and topics for any of the following sessions which you attended.
(1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent)

Introduction to FIPS 140-2
  Steve Weingart, Cryptographic & Security Testing Laboratory Manager, atsec information security
  23 responses, Rating Score* 4.3

Introduction to Side-Channel Analysis and Testing
  Gary Kenworthy, Gilbert Goodwill, Cryptography Research
  19 responses, Rating Score* 3.5

Physical Security for FIPS 140-2
  Steve Weingart, Cryptographic and Security Testing Laboratory Manager, atsec information security
  24 responses, Rating Score* 4.1

The Cryptographic Module and Beyond for Data Protection in a Mobile World
  Eugene Liderman, Sriram Krishnan, Good Technology
  12 responses, Rating Score* 2.1

*The Rating Score is the weighted average calculated by dividing the sum of all weighted ratings by the number of total responses.
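The Rating Score footnote above describes a weighted average of the 1-5 ratings. As a quick illustration of that calculation (the rating distribution below is hypothetical, not taken from the survey data):

```python
def rating_score(counts):
    """Weighted average per the footnote: counts[i] is the number of
    respondents who gave rating i+1 (ratings run 1..5)."""
    total = sum(counts)
    weighted = sum((i + 1) * c for i, c in enumerate(counts))
    return round(weighted / total, 1)

# Hypothetical distribution for a session with 23 responses
# (0 ones, 1 two, 3 threes, 7 fours, 12 fives):
print(rating_score([0, 1, 3, 7, 12]))  # prints 4.3
```

The score is rounded to one decimal place to match the presentation used in the tables.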


4. Wednesday, September 25, Plenary Sessions: Please rate the overall quality of speakers and topics for any of the following sessions which you attended.
(1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent)

Keynote
  Charles H. Romine, Director, Information Technology Laboratory, NIST
  40 responses, Rating Score* 3.4

Keynote
  Dr. Bertrand du Castel, Schlumberger Fellow and Java Card Pioneer
  39 responses, Rating Score* 3.1

CAVP/CMVP Status: What's Next?
  Sharon Keller, NIST; Carolyn French, CSEC; Randall Easter, NIST
  40 responses, Rating Score* 3.5


5. Wednesday, September 25, Certification Programs Track: Please rate the overall quality of speakers and topics for any of the following sessions which you attended.
(1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent)

High Impact CMVP and FIPS 140-2 Implementation Guidance
  Kim Schaffer, Apostol Vassilev, Jim Fox, NIST
  31 responses, Rating Score* 3.4

The Current Status of CMVP in Japan; The Current Status of CMVP in Korea; Commercial Cryptography in Europe
  Junichi Kondo, Director, JCMVP, IPA; Yongdae Kim, Researcher, ETRI; Helmut Kurth, Chief Scientist, atsec information security
  10 responses, Rating Score* 3.5

Panel: How Can the Validation Queue for the CMVP Be Improved?
  Moderator: Fiona Pattinson, Director of Strategy and Business Development, atsec information security
  Panelists: Michael Cooper, NIST; Steve Weymann, InfoGard; James McLaughlin, Gemalto
  30 responses, Rating Score* 3.3

Building a Certification Program: Techniques That (Might) Work
  Tammy Green, Security Architect and Vulnerability Response Director, Blue Coat Systems
  25 responses, Rating Score* 4.0

Understanding the FIPS Government Crypto Regulations for 2014
  Edward Morris, Co-founder, Gossamer Security Solutions
  22 responses, Rating Score* 4.0

FIPS and FUD
  Ray Potter, CEO, SafeLogic
  17 responses, Rating Score* 2.9


6. Wednesday, September 25, Technical Track: Please rate the overall quality of speakers and topics for any of the following sessions which you attended.
(1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent)

Physical Security Protection Based on Non-deterministic Configuration of Integrated Microelectronic Security Features
  Silvio Dragone, IBM Research-Zurich
  9 responses, Rating Score* 3.4

Non-Invasive Attack Testing: Feedback on Relevant Methods
  Sylvain Guilley, TELECOM-ParisTech; Robert Nguyen, Laurent Sauvage, Secure-IC
  8 responses, Rating Score* 3.5

Test Vector Leakage Assessment (TVLA) Methodology in Practice
  Gilbert Goodwill, Gary Kenworthy, Pankaj Rohatgi, Cryptography Research
  15 responses, Rating Score* 3.3

Electromagnetic Fault Injection in Practice
  Rajesh Velegalati, Jasper van Woudenberg, Riscure
  12 responses, Rating Score* 3.3

Security Mechanisms, Services, Protocols and Architecture for Secure Mobile Applications
  Sead Muftic, Professor in Computer Networks Security, SETECS
  13 responses, Rating Score* 2.9

How Random is Random?
  Helmut Kurth, Chief Scientist, atsec information security
  26 responses, Rating Score* 3.8


7. Thursday, September 26, Certification Programs Track: Please rate the overall quality of speakers and topics for any of the following sessions which you attended.
(1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent)

Reuse of Validation or Other Third Party Components in a 140-2 Cryptographic Module
  Jonathan Smith, Senior Cryptographic Equipment Assessment Laboratory (CEAL) Tester, CygnaCom Solutions
  13 responses, Rating Score* 3.5

The Upcoming Transition to New Algorithms and Key Sizes
  Allen Roginsky, Kim Schaffer, NIST
  23 responses, Rating Score* 3.6

Approved vs. Non-Approved: Decoding the Language
  Yi Mao, atsec information security
  18 responses, Rating Score* 3.3

Pattern-Based FIPS 140-2 Cryptographic Module Validation
  Steve Weymann, Mark Minnoch, InfoGard Laboratories
  13 responses, Rating Score* 3.5

Securing the Supply Chain for COTS ICT Products
  Sally Long, Open Group Trusted Technology Forum
  13 responses, Rating Score* 3.2

Implementation and Assessment on Cryptography for Payment Solutions
  Yan Liu, atsec information security
  11 responses, Rating Score* 3.1


8. Thursday, September 26, Technical Track: Please rate the overall quality of speakers and topics for any of the following sessions which you attended.
(1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent)

Entropy: Order From Disorder
  Tim Hall, Apostol Vassilev, NIST
  28 responses, Rating Score* 3.5

SP800-90: Reviewing the Standard
  Stephan Mueller, Principal Consultant and Evaluator, atsec information security
  20 responses, Rating Score* 3.5

Software in Silicon: Crypto-Capable Processors
  Wajdi Feghali, Intel; Valerie Fenwick, Darren Moffat, Oracle Solaris Security; David Weaver, Oracle SPARC Hardware
  14 responses, Rating Score* 3.9

Key Management Overview
  Allen Roginsky, Kim Schaffer, NIST
  26 responses, Rating Score* 3.5

ISO/IEC 19790 Status and Supporting Documents
  Miguel Bagnon, Epoche & Espri; Randall Easter, NIST
  19 responses, Rating Score* 3.9

Panel: Everything You Always Wanted to Know About Labs (But Were Afraid to Ask...)
  Moderator: Fiona Pattinson, Director of Strategy and Business Development, atsec information security
  20 responses, Rating Score* 3.3


9. How would you rate the following aspects of the conference presentations overall?
(1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent)

Answer                                                  Responses   Score*
The overall quality of the speakers                            45      3.7
The value of the topics covered during the conference          44      3.9


10. How would you rate the following aspects of ICMC 2013?
(1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent)

Answer                                                  Responses   Score*
Pre-conference services, including online information,
  registration, communications and customer service            43      3.7
The quality of the vendor exhibits                             41      3.3
The quality of the hotel conference facilities                 43      3.2
The quality of the food service                                43      3.1
The quality of the audio-visual presentation                   43      3.5


11. Would you plan to attend the ICMC conference in the future?

Answer                                  Responses   Ratio
Yes, would definitely plan to attend           29   64.4%
Maybe would plan to attend                     16   35.5%
No, would not plan to attend                    0    0.0%
No response                                     0    0.0%
Totals                                         45    100%

12. Please give us your general impression of the event, and any suggestions for improvement in the future, including topics or speakers to include.

21 responses.
13. Where would you be most likely to attend ICMC in the future? (check any that apply)

Answer             Responses   Ratio
Washington, DC            37   90.2%
Ottawa, Canada            25   60.9%
Other                      8   19.5%
Totals                    41    100%
14. What types of meeting facilities and destinations do you prefer? (check any that apply)

Answer                  Responses   Ratio
A four-star hotel              14   37.8%
An inexpensive hotel            6   16.2%
A city-center hotel            21   56.7%
A suburban hotel                7   18.9%
An airport hotel               14   37.8%
Other                           3    8.1%
Totals                         37    100%
15. Do you use social networks? (check any that apply)

Answer      Responses   Ratio
Twitter             5   16.1%
LinkedIn           29   93.5%
Facebook           18   58.0%
Google+             5   16.1%
Other               4   12.9%
Totals             31    100%
16. OPTIONAL: Contact Information (We may seek further input from some attendees)

Answer          Responses
First Name             18
Last Name              18
Company Name           18
Email Address          18