
• Design and implement enterprise-level data architecture solutions across relational, dimensional, and NoSQL data platforms.
• Evaluate, integrate, and manage CIAM, CDP, and CRM systems for large-scale enterprise environments.
• Work extensively with data ingestion pipelines, ETL processes, and data integration protocols.
• Develop and maintain database solutions using RDBMS, NoSQL, and analytic data platforms.
• Implement customer identity management solutions using protocols such as OAuth, OpenID Connect, and SAML.
• Design scalable AdTech systems including large-scale ad delivery and digital marketing data platforms.
• Collaborate with business and technical teams to support digital marketing and e-commerce data strategies.
• Ensure data governance, security, scalability, and performance across enterprise systems.
• Develop database scripts and optimize data models for high-performance systems.
• Provide architectural guidance for CRM and identity management solutions.
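The responsibilities above include building data ingestion pipelines and ETL processes. As a minimal illustration of the extract-transform-load pattern, the sketch below uses Python's standard library with entirely hypothetical data, column names, and an in-memory SQLite target; a production pipeline would read from real sources and write to an enterprise warehouse.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed for illustration only.
RAW_CSV = """customer_id,email,signup_date
1, Alice@Example.COM ,2024-01-05
2,bob@example.com,2024-02-10
3,,2024-03-01
"""

def extract(text):
    # Extract: parse the raw CSV into a list of row dictionaries.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: normalise email addresses and drop rows missing one.
    cleaned = []
    for row in rows:
        email = row["email"].strip().lower()
        if email:
            cleaned.append((int(row["customer_id"]), email, row["signup_date"]))
    return cleaned

def load(rows, conn):
    # Load: idempotent upsert into the target table so reruns are safe.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers "
        "(customer_id INTEGER PRIMARY KEY, email TEXT, signup_date TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print(count)  # 2 (the row with a missing email is dropped)
```

The same three-stage shape carries over to the tools named in this listing (Azure Data Factory, Databricks, Spark): only the extract sources, transformation engine, and load target change.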
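The list above also mentions customer identity protocols such as OAuth and OpenID Connect. As a small sketch of the first step of the OIDC authorization-code flow, the snippet below builds an authorization request URL; the endpoint, client ID, and redirect URI are hypothetical, and a real integration would take them from the identity provider's discovery document (`.well-known/openid-configuration`).

```python
from urllib.parse import urlencode

# Hypothetical identity-provider endpoint for illustration.
AUTHORIZE_ENDPOINT = "https://idp.example.com/authorize"

def build_auth_url(client_id, redirect_uri, state):
    # Authorization-code flow request; the 'openid' scope is what
    # distinguishes an OIDC request from plain OAuth 2.0.
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "openid profile email",
        "state": state,  # CSRF token, verified when the code comes back
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"

url = build_auth_url("my-app", "https://app.example.com/cb", "xyz123")
print(url)
```

After the user authenticates, the provider redirects back with a `code` parameter, which the server exchanges for ID and access tokens; SAML covers the same federation use case with XML assertions instead of JSON tokens.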
Logicplanet IT Services (India) Pvt. Ltd., incorporated in 2007 and headquartered in Hyderabad, operates as a software publishing, consulting, and IT solutions provider. The company delivers enterprise technology services including software development, digital transformation, and IT staffing solutions. With expertise in areas such as embedded systems, QA automation, ERP, and cloud technologies, Logicplanet supports global clients by combining technical innovation with workforce solutions, positioning itself as both a technology partner and a recruitment facilitator.
Job ID: 147443853
Skills:
Data Engineering, Snowflake, Java, Spark SQL, BigQuery, Oracle SQL Server, T-SQL, Power BI, Scala, Tableau, Azure Databricks, Redshift, SQL, ELT, Azure Synapse, Azure Data Factory, MySQL, Python, ETL, Unity Catalog, Delta Lake, Business Intelligence