Bear Metal

Engineering Tomorrow’s Systems

Bear Metal OÜ is an experienced systems integrator specializing in high-volume, low-latency data streaming, communications infrastructure, edge computing, hardware design, and embedded development.

We help unlock tactical, near-real-time data streams from decentralized edge devices and vehicles to fuel deconfliction, reconfiguration, machine learning, and AI, while ensuring compliance with applicable regulatory frameworks.

Known for implementing large-scale, mission-critical systems in Commerce and Automotive, we are expanding into Unmanned Vehicle Technologies. We develop secure, compliant, vertically integrated autonomous solutions for Europe while maintaining our commitment to reliability and innovation.

Data Processing

We possess a deep understanding of the trade-offs inherent in batch, stream, and incremental processing. Our experience includes designing, leading, and operating low-latency, planetary-scale data delivery solutions.

We recognize that data mesh initiatives, which emphasize strong domain-specific and semantic ownership, are crucial for preventing the siloing and drift of operational data from analytical data and insights.

Furthermore, we have frequently observed the pitfalls of extensive and costly data copying, which often leads to multiple low-quality “systems of truth.” A more integrated approach, leveraging cost-effective change notifications, can effectively mitigate this issue.

Capabilities

  • Defining and evolving event schemas and structures.
  • Emitting events for both exactly-once delivery guarantees and lossy analytical use cases.
  • Implementing Change Data Capture at extreme scale.
  • Performing stateful and stateless stream processing, with manual DAG control as well as SQL abstractions.
  • Optimizing the query layer experience, including SQL, Apache Calcite, and query federation.
  • Implementing query-after-notify patterns.
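The query-after-notify pattern in the last bullet can be sketched in a few lines of Python: rather than fanning out full payloads to every consumer, the producer emits a lightweight notification carrying only the changed key, and consumers query the system of truth on demand. The `SystemOfTruth` and `Notifier` names below are illustrative stand-ins, not part of any specific product.

```python
# Minimal query-after-notify sketch: notifications carry only the key;
# consumers fetch the authoritative record when (and if) they need it.

class SystemOfTruth:
    """Illustrative stand-in for the authoritative data store."""
    def __init__(self):
        self._rows = {}

    def put(self, key, value):
        self._rows[key] = value

    def query(self, key):
        return self._rows.get(key)


class Notifier:
    """Fans out change notifications (keys only, no payloads)."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def notify(self, key):
        for callback in self._subscribers:
            callback(key)


store = SystemOfTruth()
notifier = Notifier()

# A consumer keeps a local cache that it refreshes only on notification.
cache = {}
notifier.subscribe(lambda key: cache.update({key: store.query(key)}))

store.put("order-42", {"status": "shipped"})
notifier.notify("order-42")  # the consumer pulls the latest value on demand
```

Because notifications are key-only, a burst of updates to the same key collapses into a single fresh read, which is where the savings over payload fan-out and bulk copying come from.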

Technologies

Data Ingestion and Streaming

  • Apache Kafka: A distributed streaming platform for high-throughput, fault-tolerant ingestion and processing of event streams.
  • Kafka Connect: A framework for scalably and reliably streaming data between Apache Kafka and other data systems.
  • Debezium: An open-source distributed platform for real-time change data capture (CDC) from databases to Kafka.
  • Confluent Schema Registry: A centralized repository for managing data schemas to ensure quality and compatibility across the streaming ecosystem.
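Tools like Debezium emit changes as before/after envelopes. As a rough, dependency-free illustration of the idea (not Debezium's actual wire format), a change data capture step can be reduced to diffing two snapshots of a table keyed by primary key:

```python
# Diff two table snapshots (keyed by primary key) into change events shaped
# loosely like a CDC envelope: op is "c" (create), "u" (update), or
# "d" (delete), with before/after images of the row.

def capture_changes(before, after):
    events = []
    for key in before.keys() | after.keys():
        old, new = before.get(key), after.get(key)
        if old is None:
            events.append({"op": "c", "key": key, "before": None, "after": new})
        elif new is None:
            events.append({"op": "d", "key": key, "before": old, "after": None})
        elif old != new:
            events.append({"op": "u", "key": key, "before": old, "after": new})
    return sorted(events, key=lambda e: e["key"])


snapshot_v1 = {1: {"status": "new"}, 2: {"status": "paid"}}
snapshot_v2 = {1: {"status": "shipped"}, 3: {"status": "new"}}
events = capture_changes(snapshot_v1, snapshot_v2)
# One update (key 1), one delete (key 2), one create (key 3).
```

Production CDC reads the database's transaction log rather than diffing snapshots, so it observes every intermediate change in commit order; the before/after envelope shape is the common ground.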

Real-time Data Processing and Transformation

  • Apache Flink: A powerful real-time stream processing engine for stateful computation over unbounded and bounded data streams, with hands-on experience delivering to a diverse range of sinks.
  • Apache Calcite: A dynamic data management framework providing SQL parsing, validation, and optimization for various data sources.
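Stateful stream processing of the kind Flink performs keeps per-key state while events flow through. A dependency-free Python sketch of one core primitive, a tumbling-window count per key, looks like this (a real engine would additionally handle event time, watermarks, and fault-tolerant state):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Count events per (key, window) over a stream of (timestamp, key) pairs.

    Each event falls into the tumbling window starting at
    (timestamp // window_size) * window_size. The `state` dict is the running
    per-window counter that a production engine would checkpoint for recovery.
    """
    state = defaultdict(int)  # (window_start, key) -> count
    for timestamp, key in events:
        window_start = (timestamp // window_size) * window_size
        state[(window_start, key)] += 1
    return dict(state)


events = [(0, "a"), (3, "a"), (5, "b"), (12, "a")]
counts = tumbling_window_counts(events, window_size=10)
# Window [0, 10): two "a" events and one "b"; window [10, 20): one "a".
```

The same state-per-key-and-window shape underlies sliding windows and sessionization; only the window assignment function changes.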

Cloud-Based Data Warehousing and Processing

  • GCP Dataflow: A fully managed service for executing data processing pipelines, supporting both batch and stream processing with auto-scaling.
  • BigQuery: A fully managed, serverless data warehouse designed for highly scalable and cost-effective analytics on petabytes of data.

Data Presentation

Highlights

  • Ingested tens of millions of events per second, sustained.
  • Provided intuitive solutions for seamless tenant migration across jurisdictions at planetary scale.