Archives AI News

Apple Watch Series 11 announced with 5G and stronger glass


Apple just announced the Apple Watch Series 11, which looks similar to its predecessors but packs new features like 5G cellular connectivity and stronger glass. The Series 11 is the first Apple Watch with 5G connectivity, an upgrade from previous versions that relied on a slower 4G LTE connection. It will also be the first Apple […]

The Apple Watch Ultra 3 has 42 hours of battery life and satellite connectivity


Apple has announced its next high-end Apple Watch, the Apple Watch Ultra 3. The new Ultra follows the Apple Watch Ultra 2, which Apple announced in 2023; the black version of the Ultra 2, which Apple introduced last year; and the original Apple Watch Ultra, which Apple launched in 2022. Apple’s Ultra watches are generally […]

iPhone 17 event live blog: on the ground at Apple’s keynote


It's time for another "awe dropping" Apple event. The company is expected to announce the iPhone 17 today, alongside some new Apple Watches and perhaps the AirPods Pro 3. We got a glimpse of some software at WWDC 2025, but today is all about the new hardware, baby. We're anticipating the usual - the iPhone […]

Introducing the Agentic SOC Workshops for security professionals

The security operations centers of the future will use agentic AI to enable intelligent automation of routine tasks, augment human decision-making, and streamline workflows. At Google Cloud, we want to help prepare today's security professionals to get the most out of tomorrow's AI agents. As we build our agentic vision, we're also excited to invite you to the first Agentic SOC Workshop: Practical AI for Today's Security Teams. This complimentary, half-day event series is designed for security practitioners looking to level up their AI skills and move beyond the marketing to unlock AI's true potential.

Ultimately, we believe that agentic AI will empower security professionals to focus on complex investigations and strategic initiatives, and drive better security outcomes and operational efficiency. Our vision is a future where every customer has a virtual security assistant, trained by the world's leading security experts, that anticipates threats and recommends the best path to deliver on security goals. We are building the next class of security experts empowered by AI, and these workshops are your opportunity to become one of them.

How the Agentic SOC Workshop can boost your security skills

The Agentic SOC Workshop combines foundational security capabilities with AI to help security professionals develop the skills they need to use AI successfully. Attendees will:

- Explore the agentic SOC future: Learn about Google Cloud's vision for the future of security operations, where agentic AI systems automate complex workflows and empower analysts to focus on high-impact tasks.
- Learn by doing: Dive into a practical, real-world AI workshop tailored for security practitioners. Learn how to use Google Cloud's AI and threat intelligence to automate repetitive tasks, reduce alert fatigue, and improve your security skills.
- Participate in a dynamic Capture the Flag challenge: Put your new skills to the test in an interactive game where you use the power of AI to solve challenges and race to the finish line.
- Meet and network with peers: Gain valuable insights from industry peers and hear from other customers on their journey to modernize security operations. Connect with peers, partners, and Google experts during networking breaks, concluding with a happy hour.
- Discover practical uses for AI: Learn how to use Gemini in Google Security Operations to respond to threats faster and more effectively.

Join us in a city near you

These free, half-day workshops are specifically designed for security professionals, including security architects, SOC managers, analysts, and security engineers, as well as security IT decision-makers such as CISOs and VPs of security. We'll be holding Agentic SOC Workshops starting in Los Angeles on Wednesday, Sept. 17, and Chicago on Friday, Sept. 19. Workshops will continue in October in New York City and Toronto, with more cities to come. To register for a workshop near you, please check out our registration page.

Accelerate data science with new Dataproc multi-tenant clusters


With the rapid growth of AI/ML, data science teams need a better notebook experience to meet the growing demand for their work and its importance in driving innovation. Scaling data science workloads also creates new challenges for infrastructure management. Allocating compute resources per user provides strong isolation (the technical separation of workloads, processes, and data from one another), but can cause inefficiencies due to siloed resources. Shared compute resources offer more opportunities for efficiency, but at a sacrifice in isolation. The benefit of one comes at the expense of the other. There has to be a better way.

We are announcing a new Dataproc capability: multi-tenant clusters. This new feature provides a Dataproc cluster deployment model suited to many data scientists running their notebook workloads at the same time. The shared cluster model allows infrastructure administrators to improve compute resource efficiency and cost optimization without compromising granular, per-user authorization to data resources, such as Google Cloud Storage (GCS) buckets.

This isn't just about optimizing infrastructure; it's about accelerating the entire cycle of innovation that your business depends on. When your data science platform operates with less friction, your teams can move from hypothesis to insight to production faster. This allows your organization to answer critical business questions faster, iterate on machine learning models more frequently, and ultimately deliver data-powered features and improved experiences to your customers ahead of the competition. It helps evolve your data platform from a necessary cost center into a strategic engine for growth.
How it works

This new feature builds on Dataproc's previously established service account multi-tenancy. For clusters in this configuration, only a restricted set of users declared by the administrator may submit workloads. Administrators also declare a mapping of users to service accounts. When a user runs a workload, all access to Google Cloud resources is authenticated only as that user's mapped service account. Administrators control authorization in Identity and Access Management (IAM), such as granting one service account access to one set of Cloud Storage buckets and another service account access to a different set.

As part of this launch, we've made several key usability improvements to service account multi-tenancy. Previously, the mapping of users to service accounts was established at cluster creation time and could not be modified. We now support changing the mapping on a running cluster, so administrators can adapt more quickly to changing organizational requirements. We've also added the ability to externalize the mapping to a YAML file for easier management of a large user base.

Jupyter notebooks connect to the cluster via the Jupyter Kernel Gateway. The gateway launches each user's Jupyter kernels, distributed across the cluster's worker nodes. Administrators can horizontally scale the worker nodes to meet end-user demand, either by manually adjusting the number of worker nodes or by using an autoscaling policy. Notebook users can choose Vertex AI Workbench for a fully managed Google Cloud experience or bring their own third-party JupyterLab deployment. In either model, the BigQuery JupyterLab extension integrates with Dataproc cluster resources.
Vertex AI Workbench instances can deploy the extension automatically, or users can install it manually in their third-party JupyterLab deployments.

Under the hood

Dataproc multi-tenant clusters are automatically configured with additional hardening to isolate independent user workloads:

- All containers launched by YARN run as a dedicated operating system user that matches the authenticated Google Cloud user.
- Each OS user also has a dedicated Kerberos principal for authentication to Hadoop-based Remote Procedure Call (RPC) services, such as YARN.
- Each OS user is restricted to accessing only the Google Cloud credentials of their mapped service account. The cluster's compute service account credentials are inaccessible to end-user notebook workloads.
- Administrators use IAM policies to define least-privilege access authorization for each mapped service account.

How to use it

Step 1: Create a service account multi-tenancy mapping

Prepare a YAML file containing your user-to-service-account mapping, and store it in a Cloud Storage bucket. For example:

```yaml
user_service_account_mapping:
  bob@my-company.com: service-account-for-bob@iam.gserviceaccount.com
  alice@my-company.com: service-account-for-alice@iam.gserviceaccount.com
```

Step 2: Create a Dataproc multi-tenant cluster

Create a new multi-tenant Dataproc cluster using the user mapping file and the new JUPYTER_KERNEL_GATEWAY optional component.
```shell
gcloud dataproc clusters create my-cluster \
    --identity-config-file=gs://bucket/path/to/identity-config-file \
    --service-account=cluster-service-account@iam.gserviceaccount.com \
    --region=region \
    --optional-components=JUPYTER_KERNEL_GATEWAY \
    other args ...
```

If you need to change the user-to-service-account mapping later, you can do so by updating the cluster:

```shell
gcloud dataproc clusters update my-cluster \
    --identity-config-file=gs://bucket/path/to/identity-config-file \
    --region=region
```

Step 3: Create a Vertex AI Workbench instance with Dataproc kernels enabled

For users of Vertex AI Workbench, create an instance with Dataproc kernels enabled. This automatically installs the BigQuery JupyterLab extension.

Step 4: Install the BigQuery JupyterLab extension in third-party deployments

For users of third-party JupyterLab deployments, such as one running on a local laptop, install the BigQuery JupyterLab extension manually.

Step 5: Launch kernels in the Dataproc cluster

Open the JupyterLab application either from a Vertex AI Workbench instance or on your local machine. The JupyterLab Launcher page opens in your browser. It shows the Dataproc Cluster Notebooks section if you have access to Dataproc clusters with the Jupyter optional component or the Jupyter Kernel Gateway component.

To change the region and project:

1. Select Settings > Cloud Dataproc Settings.
2. On the Setup Config tab, under Project Info, change the Project ID and Region, and then click Save.
3. Restart JupyterLab to make the changes take effect.

Select the kernel spec corresponding to your multi-tenant cluster.
Once the kernel spec is selected, the kernel launches; it takes about 30-50 seconds for the kernel to go from the Initializing state to the Idle state. Once the kernel is Idle, it is ready for execution.

Get started with multi-tenant clusters

Stop choosing between security and efficiency. With Dataproc's new multi-tenant clusters, you can empower your data science teams with a fast, collaborative environment while maintaining centralized control and optimizing costs. This new capability is more than an infrastructure update; it's a way to accelerate your innovation lifecycle.

This feature is now available in public preview. Get started today by exploring the technical documentation and creating your first multi-tenant cluster. Your feedback is crucial as we continue to evolve the platform, so please share your thoughts with us at dataproc-feedback@google.com.
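A mistake in the Step 1 mapping file only surfaces once the cluster rejects it, so it can be worth sanity-checking the file's shape locally first. The sketch below is a hypothetical helper, not part of Dataproc or the gcloud CLI; it assumes the simple two-level YAML layout shown in Step 1 and parses it with the Python standard library alone rather than a YAML library.

```python
# Hypothetical helper (not part of Dataproc): sanity-check a Step 1
# user-to-service-account mapping file before uploading it to Cloud Storage.
# Assumes the flat two-level layout shown above, so a line-based parse
# with the standard library is enough (no PyYAML dependency).

def parse_mapping(text: str) -> dict[str, str]:
    """Parse the user_service_account_mapping YAML into a dict."""
    mapping = {}
    in_block = False
    for raw in text.splitlines():
        line = raw.rstrip()
        if not line or line.lstrip().startswith("#"):
            continue  # skip blanks and comments
        if line == "user_service_account_mapping:":
            in_block = True
            continue
        if in_block and line.startswith("  "):
            # Split only on the first colon: "user: service-account"
            user, _, account = line.strip().partition(":")
            mapping[user.strip()] = account.strip()
    return mapping

def validate_mapping(mapping: dict[str, str]) -> list[str]:
    """Return a list of problems; an empty list means the mapping looks sane."""
    problems = []
    for user, account in mapping.items():
        if "@" not in user:
            problems.append(f"{user!r} does not look like a user email")
        if not account.endswith(".gserviceaccount.com"):
            problems.append(f"{account!r} is not a service account address")
    return problems

sample = """\
user_service_account_mapping:
  bob@my-company.com: service-account-for-bob@iam.gserviceaccount.com
  alice@my-company.com: service-account-for-alice@iam.gserviceaccount.com
"""

parsed = parse_mapping(sample)
assert len(parsed) == 2
assert validate_mapping(parsed) == []
```

Running `validate_mapping` on each edit before the `gcloud dataproc clusters update` call in Step 2 catches malformed entries cheaply; for anything beyond this flat layout, a real YAML parser would be the safer choice.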

All the news from Apple’s iPhone 17 keynote


Apple is ready to announce the iPhone 17 line, and this promises to be a bigger launch than most. We’re expecting a major redesign to the cameras and looking forward to an entirely new addition to the lineup: the extra-thin iPhone 17 Air. Alongside the new iPhones, Apple is likely to update its Apple Watch […]

Inside the Man vs. Machine Hackathon

At a weekend hackathon in San Francisco, more than 100 coders gathered to test whether they could beat AI—and win a $12,500 cash prize.