Shows & Panels
- Accelerate and Streamline for Better Customer Service
- Ask the CIO
- The Big Data Dilemma
- Carrying On with Continuity of Operations
- Client Virtualization Solutions
- Data Protection in a Virtual World
- Expert Voices
- Federal Executive Forum
- Federal IT Challenge
- Federal Tech Talk
- Feds in the Cloud
- Health IT: A Policy Change Agent
- IT Innovation in the New Era of Government
- Making Dollars And Sense Out of Data Center Consolidation
- Navigating the Private Cloud
- One Step to the Cloud, Two Steps Toward Innovation
- Path to FDCCI Compliance
- Take Command of Your Mobility Initiative
Agencies and universities are refining job descriptions, revamping training and education programs, and helping industry, academia and government begin to reach consensus on the makeup of a modern-day cybersecurity workforce. The Office of Personnel Management also has made changes to personnel systems so that job descriptions map to the framework. The plan already has had an impact on cyber education at colleges and universities across the country.
A number of agencies have made high-profile migrations to cloud platforms and the Obama administration has issued sweeping guidance mandating agencies identify and transition services and applications to host in the cloud. For a look at how agencies are faring in their shifts to the cloud and the issues they continue to face, the Federal Drive with Tom Temin and Emily Kopp hosted a panel discussion, "Clearing the Fog Around Cloud Computing," sponsored by Level 3 Communications.
The agency plans to release solicitations to help agencies implement sensors to detect threats, followed by industry-provided services to analyze them. Congress approved $183 million, beginning in 2013, to help get continuous monitoring off the ground more quickly.
The Government Accountability Office said reports of malware targeting mobile devices have nearly tripled in less than a year.
The National Institute of Standards and Technology wants comments on new draft guidelines for securing the Basic Input/Output System (BIOS). BIOS is the first software activated when a computer is turned on, and it has increasingly become a target for hackers.
Instead of using a lengthy security technical implementation guide approval process to decide which tablets and smartphones will be allowed to use its network, the Defense Information Systems Agency wants to put the ball in the vendors' court.
Updated how-to guidance for responding to cyber incidents is out.
The National Institute of Standards and Technology is making it easier for agencies to test the use of logical access control for applications.
The General Services Administration will hold a vendor day Aug. 7 in Washington, D.C. The concept of identity management in the cloud builds on the efforts included in the National Strategy for Trusted Identities in Cyberspace (NSTIC).
New guidelines could help agencies adopting bring-your-own-device strategies manage the potential risks smartphones and tablets could pose.
How is the government using big data currently?
Christopher Fountain, senior vice president of SecureInfo, joins host John Gilroy to talk about IT security.
July 10, 2012
The agency will hold a workshop July 25 to review the second draft of FIPS 201-2.
NIST launched the National Cybersecurity Center of Excellence in February and now is giving industry details on how it will work. The center's goal is to bring businesses and government together to solve cyber problems.
TK Keanini, chief technology officer for nCircle, joins host John Gilroy to talk about how his company can help your agency with its network security issues.
June 26, 2012
NIST and DHS experts say protecting smartphones and tablets shouldn't be any different than securing typical desktop or laptop computers. DHS will release a mobile security reference architecture to help agencies understand common concepts, and NIST is updating its security control guide with 250 new requirements, including mobile controls.
Jacob Taylor, a physicist at the National Institute of Standards and Technology, is a finalist for a 2012 Service to America Medal.
Big data enthusiasts from government, industry and academia are getting their hands dirty. The National Institute of Standards and Technology and the National Science Foundation held a two-day workshop recently to explore the technologies needed to collect and analyze big data. Attendees also examined how big data can enhance areas like science, health and security. The government announced in March its plans to invest $200 million in the growing field.
The National Institute of Standards and Technology is trying to demystify cloud computing for federal agencies. It has just published the final version of a document called Cloud Computing Synopsis and Recommendations. In it, NIST aims to provide a plain-language breakdown of how clouds are deployed, what services they can offer, typical terms of service, and security issues. NIST says the publication is aimed at IT decision makers, designed to help them decide what cloud technologies and configurations will meet their needs.
The group will create a white paper with recommendations this summer to modernize the 10-year-old policy. Among the areas under review are continuous monitoring, cloud computing, shared services and the definition of a system. Updating A-130 will help agencies move from a "checklist" mode to monitoring systems in real time for threats and vulnerabilities, said Frank Reeder, a former OMB official.