    2,000 10gb nic jobs found, pricing in NZD

    ...developers / power users to source data and establish automated integration; experience with integrating relational databases and APIs from web services is crucial. - Data Consolidation: You should be able to create a cohesive, consolidated dataset from these different sources applying appropriate design principles. - Medium-sized project: The data needing integration and consolidation is between 1GB and 10GB, so experience handling medium-sized data sets is ideal. Your main aim will be to ensure that the data warehouse is efficient and can handle the needs of our data integration and consolidation. Experience in similar projects is essential. You will be working with roughly 5 databases, some direct access, some via API (including basic XML feed in 1 instance). We do not have e...

    $21 / hr (Avg Bid)
    5 bids

    My project involves migrating a medium-sized database (between 1GB and 10GB) to Oracle, and optimizing its performance. The database schema will also need to be designed, as this is not yet finalized. Key responsibilities include: - Migrating existing data: I need an expert who can handle this without losing any data or causing downtime. - Performance optimization: Once the data is migrated, the database needs to be optimized for speed and efficiency. - Designing the database schema: Your ability to design a schema that aligns with our business goals is critical for this project’s success. Ideal candidates should have extensive experience in Oracle database development, with a focus on data migration and performance optimization. A proven track record in designing database s...

    $17 (Avg Bid)
    2 bids

    ...developers / power users to source data and establish automated integration; experience with integrating relational databases and APIs from web services is crucial. - Data Consolidation: You should be able to create a cohesive, consolidated dataset from these different sources applying appropriate design principles. - Medium-sized project: The data needing integration and consolidation is between 1GB and 10GB, so experience handling medium-sized data sets is ideal. Your main aim will be to ensure that the data warehouse is efficient and can handle the needs of our data integration and consolidation. Experience in similar projects is essential. You will be working with roughly 5 databases, some direct access, some via API (including basic XML feed in 1 instance). We do not have e...

    $39 / hr (Avg Bid)
    47 bids

    I have a high-complexity T-SQL stored procedure used for data analysis that I need translated into PySpark code. The procedure involves advanced SQL operations, temporary tables, and dynamic SQL. It currently handles over 10GB of data. - Skills Required: - Strong understanding and experience in PySpark and T-SQL languages - Proficiency in transforming high complexity SQL scripts to PySpark - Experience with large volume data processing - Job Scope: - Understand the functionality of the existing T-SQL stored procedure - Rewrite the procedure to return the same results using PySpark - Test the new script with the provided data set The successful freelancer will assure that the new PySpark script can handle a large volume of data and maintain the same output as the pres...

    $287 (Avg Bid)
    13 bids
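
    A rough illustration of the kind of translation this listing asks for: a T-SQL pattern built on a temporary table plus an aggregate can usually be re-expressed as a PySpark temporary view queried with spark.sql, or as DataFrame operations. This is only a minimal sketch with invented column names (order_id, region, amount) and toy data, not the poster's actual stored procedure; dynamic SQL and large temp tables need case-by-case treatment.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("tsql_to_pyspark_sketch").getOrCreate()

        # Toy data standing in for the table the stored procedure reads.
        orders = spark.createDataFrame(
            [(1, "EU", 120.0), (2, "EU", 80.0), (3, "US", 200.0)],
            ["order_id", "region", "amount"],
        )

        # T-SQL "SELECT ... INTO #tmp" roughly maps to a temporary view.
        orders.createOrReplaceTempView("tmp_orders")

        # The aggregate a dynamic SQL block might have built, written statically.
        summary = spark.sql("""
            SELECT region, COUNT(*) AS order_count, SUM(amount) AS total_amount
            FROM tmp_orders
            GROUP BY region
        """)
        summary.show()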

    I urgently require an experienced GitLab expert to perform an upgrade of our large (more than 10GB) GitLab installation from version 7.9 to the latest version. Key Requirements: - Ensure successful migration with no data loss. - Anticipate any potential issues that might arise due to the jump in versions, and propose mitigations - Test and ensure existing functionality continues to work after the upgrade Ideal Freelancer: - Will have extensive experience with GitLab, specifically with version upgrades - Should have strong problem-solving abilities and foresight to handle unexpected issues - Have handled large-scale projects with sizable repositories and ensured smooth migration Please provide examples of similar migrations you've conducted in the past to prove your ability ...

    $190 (Avg Bid)
    2 bids

    I have recently returned from a memorable journey in the Caribbean and am looking to transform about 125 raw footage videos (drone, camera) and 10GB of footage into a captivating, 3-4 minute cinematic video. Key Requirements: - Objective: The main purpose of this video is to document my journey and experiences in the Caribbean. Therefore, the video should be more of a visual diary than a promotional or commercial piece. - Style: I have a preference for a fast, cinematic style of video production. I want the video to be visually engaging and immersive to the audience. - Elements: To add a creative touch to the video, I would like to include the following elements: - Text overlays: To provide context or additional information throughout the video. - Special effects: To enha...

    $185 (Avg Bid)
    53 bids

    I'm in urgent need of a skilled PostgreSQL DBA to take on various tasks and enhancements in my database system. The database is currently large, ranging from 10GB to 100GB, and requires immediate attention to migrate it to a more scalable solution. Key tasks include but are not limited to: - Database performance optimization - Backup and recovery implementation - Database migration to a more scalable solution - Database upgrade with near zero downtime - Repmgr configuration and troubleshooting - Ora2pg Oracle to Postgres migration The ideal candidate should be an expert in PostgreSQL with experience in handling large databases. You should have a deep understanding of database performance optimization techniques, demonstrated expertise in backup and recovery, and a proven tra...

    $5 / hr (Avg Bid)
    3 bids

    I am in need of a proficient developer with a deep understanding of Azure DevOps and VM environments to help me set up a project for continuous integration and deployment. Key aspects of the project include: - The deployment of the project on an Azure VM using Azure DevOps pipelines - Ensure the system is designed to handle 1-10GB of data - The VM size should ideally be general-purpose This position requires not only technical knowledge but also the ability to understand project requirements and turn them into practical solutions. The ideal candidate will have a demonstrated track record in working with Azure DevOps and VMs, specifically within a continuous integration and continuous deployment context. Please ensure that your proposal details your relevant experience in these a...

    $151 (Avg Bid)
    12 bids

    I am in dire need of a proficient machine learning engineer who is well-experienced in model selection, training, and data preprocessing. An ideal candidate would be one who excels in the Python programming language, since it is what I am most comfortable with. Additionally, you should be adept in handling a medium-sized dataset (1GB-10GB). In short, here's what I need: - Proficiency in Python programming - Robust experience with data preprocessing, model selection, and training - Ability to handle a medium-sized dataset (1GB-10GB) This project offers a great opportunity for those looking to flex their machine learning engineering muscles. I am excited to collaborate and see what your expertise can...

    $843 (Avg Bid)
    55 bids
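
    For context on the preprocessing / model-selection / training workflow the listing above describes, here is a minimal scikit-learn sketch. The CSV path, the "target" column, and the choice of model are placeholders rather than details from the listing, and for a 1GB-10GB dataset the same pipeline would normally be fed a sample or chunked input.

        import pandas as pd
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import GridSearchCV, train_test_split
        from sklearn.pipeline import Pipeline
        from sklearn.preprocessing import StandardScaler

        df = pd.read_csv("dataset.csv")                    # placeholder path
        X, y = df.drop(columns=["target"]), df["target"]   # hypothetical label column
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

        pipe = Pipeline([
            ("scale", StandardScaler()),
            ("model", RandomForestClassifier(random_state=42)),
        ])

        # Simple model selection over a small hyperparameter grid.
        search = GridSearchCV(pipe, {"model__n_estimators": [100, 300]}, cv=3)
        search.fit(X_train, y_train)
        print(search.best_params_, search.score(X_test, y_test))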

    I'm looking for an AWS Glue expert to handle my large MySQL database, over 10GB, and write it to S3. You will need to have: - A strong understanding and experience in handling large databases. - Proficiency in using AWS Glue for data ETL processes. - Knowledge in determining the best strategy to quickly load large amounts of data. - Experience working with Parquet output format. The database gets updated daily, so a fast and efficient process is required to ensure optimal performance. Please detail your experience with similar projects when making your bid.

    $23 / hr (Avg Bid)
    31 bids
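
    The general shape of the job above, sketched with plain PySpark (which is also what runs underneath a Glue job): read the MySQL table over JDBC in parallel partitions, then write partitioned Parquet to S3. The JDBC URL, credentials, table, id bounds, partition column, and bucket are all placeholders; a production Glue job would more likely use GlueContext, DynamicFrames, and a catalogued JDBC connection, plus a watermark column or job bookmarks so the daily refresh only loads rows changed since the last run.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("mysql_to_s3_parquet_sketch").getOrCreate()

        # Parallel JDBC read: partitionColumn/lowerBound/upperBound split the 10GB+ table
        # into numPartitions concurrent reads instead of one single-threaded pull.
        df = (spark.read.format("jdbc")
              .option("url", "jdbc:mysql://db-host:3306/mydb")        # placeholder
              .option("dbtable", "orders")                            # placeholder
              .option("user", "etl_user").option("password", "***")   # placeholder
              .option("partitionColumn", "id")
              .option("lowerBound", 1).option("upperBound", 50000000)
              .option("numPartitions", 16)
              .load())

        # Write Parquet to S3, partitioned by a date column for the daily refresh.
        (df.write.mode("overwrite")
           .partitionBy("load_date")                                  # hypothetical column
           .parquet("s3://my-bucket/orders/"))                        # placeholder bucket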

    I am struggling with a range of issues in my SQL database that are affecting my business operations. The primary challenges involve: - Slow database performance. - High CPU usage rate. - Regular timeouts and deadlock situations. The size of my database is substantial, significantly above 10GB, implying the need for a professional who can manage and improve large-capacity systems. The target areas for amelioration in the SQL database include: - Efficiency of the queries. - Table partitioning strategy. The ideal candidate should evidence expert skills and a successful track record in managing large SQL databases, improving query efficiency, and implementing effective table partitioning strategies. In-depth knowledge of comprehensive database performance tuning and optimization t...

    $38 / hr (Avg Bid)
    64 bids

    I'm currently using SQL Express and I'm looking to upgrade to the Standard Edition. I also need assistance with backing up and restoring my medium-sized database (1GB to 10GB). Key Requirements: - Upgrade SQL Express to SQL Standard Edition - Backup and restore an existing medium-sized database - Ensure the process is seamless and data integrity is maintained Ideal Skills and Experience: - Proven experience with SQL Server upgrades - Excellent understanding of database backup and restore processes - Familiarity with SQL Express and Standard Edition - Strong attention to detail to ensure data integrity Please apply if you have a successful track record in similar projects.

    $1652 (Avg Bid)
    81 bids

    As the professional handling this project, you'll engage with big data exceeding 10GB. Proficiency in Python, Java, and PySpark is vital for success, as we demand expertise in: - Data ingestion and extraction: The role involves managing complex datasets and running ETL operations. - Data transformation and cleaning: You'll also need to audit the data for quality and cleanse it for accuracy, ensuring integrity throughout the system. - Handling streaming pipelines and Delta Live Tables: Mastery of these could be game-changing in our pipelines, facilitating the real-time analysis of data.

    $31 / hr (Avg Bid)
    35 bids

    As a business with a medium-sized SQL database, my aim is to enhance data interpretation and decision making by utilizing Microsoft Power BI for data visualization, interactive dashboards, and comprehensive reporting. Key Requirements: - Expertise in Microsoft Power BI to analyze database. - Experience with SQL and understanding of data volume (1GB - 10GB). - Ability to visualize data through charts and graphs, create interactive dashboards, and generate comprehensive reports using Power BI. - Familiarity with implementing security measures, including data encryption, user authentication, and permission-based access control. Please bid if you have the ideal skills and proven experience. Your bid will be evaluated based on your expertise, client feedback, and cost-effectiveness. L...

    $52 / hr (Avg Bid)
    97 bids

    I'm seeking an experienced professional to manually migrate a WordPress website from an existing Ubuntu server to a new one. The size of the website is 8-10GB and there are specific plugins and themes installed that need to be carefully transferred to the new server. Key project requirements include: - Expertise in Ubuntu server administration and WordPress migration - Proficiency in handling large website migrations - Experience in migrating specific plugins and themes Given that downtime is not a concern, the priority lies in ensuring that the migration is successful and that all elements of the current website are accurately transferred to the new server. The website must function correctly on the new server once the migration is complete.

    $31 (Avg Bid)
    22 bids

    ...the installation is complete, launch Vistumbler. Caution Some AV software may indicate that Vistumbler is a virus. It might be necessary to temporarily turn off your AV software for this project. Be sure to turn AV back on when the project is completed. If necessary, expand the window to full screen. Click Scan APs. If no networks appear, click Interface and then select the appropriate wireless NIC interface. Note the columns Signal and High Signal. How could they be used in a site survey? Click Graph 1. Click one of the APs displayed at the bottom of the screen. Allow Vistumbler to accumulate data over several minutes. What information is displayed on this graph? Click Graph 2. Click another one of the APs displayed at the bottom of the screen. Allow Vistumbler to accumul...

    $26 (Avg Bid)
    6 bids

    I'm looking for an expert with significant experience in setting up and configuring Amazon Web Services (AWS) Relational Database Service (RDS) for my MSSQL server, or someone who can guide me through the process with Google Cloud SQL instead. Key Details: - Database Size: The anticipated database size is medium, ranging from 10GB to 100GB. - Cloud Platform: The preferred cloud platform for this project is AWS. The project may require setting up several AWS features; however, I am unsure at the moment about the accessibility features needed. An ideal candidate for this project should be highly experienced in AWS, be capable of advising on the best setup to optimize performance, and have a strong understanding of MSSQL. Any expertise advising on high availability and load balancing would be ...

    $151 (Avg Bid)
    25 bids

    I am seeking a professional with significant experience in working with both structured and unstructured data. The ideal candidate should be proficient in Spark and Tableau, as these are the primary tools for the project. The scope of the project involves processing data of medium volume, between 1GB and 10GB. Skills & Experience Required: - Proficiency in Spark - Tableau expertise - Demonstrated experience with structured and unstructured data - Proven ability to effectively work with medium scale data (1-10 GB)

    $16 / hr (Avg Bid)
    8 bids

    As the owner of a large SQL Server database, I'm looking for the expertise to convert my SQL Server code to Redshift SQL. The database in question is greater than 10GB and primarily contains valuable customer data. The ideal freelancer for this project should: - Have a strong understanding of SQL and Redshift - Have experience in converting stored procedures, views, functions, etc., specifically from SQL Server to Redshift - Be familiar with transforming and restructuring customer data Your job will be to ensure the correct and efficient migration of all customer data without loss of any rows. Be prepared to handle the larger data volume of this SQL Server database.

    $29 / hr (Avg Bid)
    10 bids

    Task details are as follows: 1 Integrate Nagios with PostgreSQL 2 Integrate Nagios with iLOM/iLO/iDRAC SNMP modules 3 Monitor system CPU, RAM, storage and NIC ports and configure thresholds 4 Monitor individual processes' CPU, RAM, IO rate, etc. and configure thresholds (including DB and web server processes) 5 Stop and start of the processes 6 Auto restart of the processes and keep track of start and stop 7 Monitoring connectivity 8 Email integration for thresholds 9 SMS integration (if applicable) 10 Dashboards Interested candidates may apply

    $467 (Avg Bid)
    2 bids

    I am seeking an experienced IT professional or consultant to help correct the high transfer charges I am experiencing between my S3 bucket and RDS service. The goal is to optimize content delivery... Configuring Amazon CloudFront to deliver content. The following would be the ideal skills or experience for the job: - Proficiency in AWS (Amazon Web Services) operations. - Specific experience and thorough understanding of S3 Bucket, RDS, and CloudFront. - Ability to troubleshoot and solve related configuration issues. Ideally, the worker will be familiar with handling large data (more than 10GB) consisting primarily of images and static files. They should be able to provide insights into how to optimize content delivery while minimizing costs to ensure the S3 and RDS services are used ef...

    $15 / hr (Avg Bid)
    5 bids

    Task details are as follows: 1 Integrate Nagios with PostgreSQL 2 Integrate Nagios with iLOM/iLO/iDRAC SNMP modules 3 Monitor system CPU, RAM, storage and NIC ports and configure thresholds 4 Monitor individual processes' CPU, RAM, IO rate, etc. and configure thresholds (including DB and web server processes) 5 Stop and start of the processes 6 Auto restart of the processes and keep track of start and stop 7 Monitoring connectivity 8 Email integration for thresholds 9 SMS integration (if applicable) 10 Dashboards Interested candidates may apply

    $490 (Avg Bid)
    1 bid

    I'm in need of a specialist to quickly set up my already registered domain. I have a domain, hosting and multiple emails for that domain; one email account holds about 10GB of data. I have purchased hosting services with GoDaddy. I need the hosting, WordPress website and email IDs to be transferred to GoDaddy and cPanel. Ideal Candidate: - Has extensive experience with domain hosting and email setup - Familiar with diverse hosting options - Can work fast; the project is needed ASAP

    $59 (Avg Bid)
    26 bids

    I am in need of a Remote Desktop Protocol (RDP) with Windows 10 installed, capable of handling high-intensity software testing tasks. This project will require: Min. Intel Core i7-6700 or above Min. SSD SATA 250 GB or above Min. 4x RAM 16384 MB DDR4 or equivalent and above Min. NIC 1 Gbit Intel I219-LM Location: US , EU (English) 1 x Primary IPv4 1 Month Warranty.

    $20 (Avg Bid)
    4 bids

    We are looking for a CA (fresh or experienced) to register a new startup company under a proprietorship, responsible for preparing and applying for all necessary documentation: collection of director KYC for the company, determination of the NIC code, MOU and AOA, PAN, TAN, GST, a corporate bank account, and application to a payment gateway.

    $196 (Avg Bid)
    27 bids

    Require an experienced Microsoft SQL Server Database Administrator to help optimize my database, specifically focusing on improving its performance through Indexing, Query Optimization, and Data Caching. Key Details: - Several stored procedures are currently experiencing performance issues, requiring urgent optimization to ensure peak functionality. Note that we have a DB of at least 10GB. - This may involve revisiting our database architecture or restructuring the data model if needed. - In addition, we need backup and recovery mechanisms, a database health check, and partitioning and indexing. Your primary objective will be to enhance our database procedures to facilitate rapid data retrieval. You will guide us through the administration process, offering expertise to maintain optimal performance...

    $34 / hr (Avg Bid)
    32 bids

    I need an experienced Linux system network engineer to configure the networking features of my AlmaLinux server. The work primarily involves: - NIC Configuration: The server uses an Ethernet Network Interface Card (NIC). I require assistance to properly configure this Ethernet NIC. - Ethernet Configurations: The foremost Ethernet setting I want to implement is the static IP setup. Therefore, a profound understanding of IP addressing and subnetting is essential. In addition to Linux system administration, your skills and experience should ideally include AlmaLinux, NIC settings configuration and IP addressing. This should be a straightforward assignment for those familiar with networking on Linux servers. Please only bid if you have prior experience in the abo...

    $120 (Avg Bid)
    14 bids

    I need a professional with proven Azure SSAS skills to...the SSAS cube jobs which are failing - Improve our data analysis capabilities - Optimize our data model's performance - Enhance our reporting and visualization abilities This project needs to be completed quickly, within 2 weeks. Candidates should be prepared for an accelerated pace and must demonstrate a strong ability to meet tight deadlines. The data size to be processed is between 1GB and 10GB. Experience with managing and optimizing similar or larger datasets on Azure SSAS is critical. Please apply if you have solid Azure SSAS credentials and can handle an urgent timeline. Your bid should reflect a thorough understanding of the requirements and propose an executable approach to meet the objectives within the stipu...

    $18 / hr (Avg Bid)
    5 bids

    I'm in need of an experienced professional who can assist me in transferring a large-scale database. Key requirements include: - Proficiency in handling and transferring databases, specifically SQL Server. - Experience in migrating large databases, in this case, exceeding 10GB. - Demonstrated knowledge in handling data and ensuring its integrity. - Ability to ensure smooth transfer with minimal downtime. Your role will involve transferring selected data safely, efficiently and without data loss. Prior experience with similar projects is highly advantageous.

    $776 (Avg Bid)
    62 bids

    As a business, we are seeking assistance to clean up and standardize a signi...duplicate records: We are experiencing an issue with duplicated data. Your key task will be to effectively identify and remove these to ensure our records are unique and reliable. - Standardize data format: As the data is mixed, consistency in its format is vital for our operations. Efforts will be focused on uniformly formatting this data. The data volume expected to be addressed is between 1GB and 10GB. Ideal applicants for this project would have proven experience in data cleaning, especially with mixed data types, and a keen eye for detail to spot and address discrepancies. Familiarity with large-scale data handling is also beneficial. With your help, we aim to streamline our datasets for more effi...

    $174 (Avg Bid)
    34 bids

    ...hyper-v switch manager to add these to the CHR and then access can be granted directly via winbox to the mkt CHR TASKS 1) "Basic" installation and configuration of CHR (cloud hosted router) on my pc running on hyper-v vm using my 4 port NIC 2.5GbE. "Basic" = internet gateway + dhcp server. 2) Bypass ISP-blocked ports 80 and 443 using Mikrotik CHR on a hyper-v VM For port 80,443 bypass we would prefer to use cloudflare tunnel The configuration will be like the diagram: the ISM modem (1) will connect to a 4 port ethernet NIC on my pc running CHR and from there it will connect to my current Mikrotik to connect to the other devices on my network. I can't disable DHCP on the ISP Modem, so the CHR will get the IP from the ISP modem but will use ...

    $41 / hr (Avg Bid)
    8 bids

    ...IMMEDIATELY (ANYDESK) AND COMPLETED WITHIN 1-2 HOURS OF ACCESS TIME. TASKS 1) I need to transfer my current Mikrotik RB4011Igs+ (ports are all GB) configuration to a CHR (cloud hosted router) on my pc running on a hyper-v vm using my 2.5GbE 4 port NIC so I can have increased speed on my router. 2nd task is optional. 2) Bypass ISP-blocked ports 80 and 443 using Mikrotik CHR on a hyper-v VM For port 80,443 bypass we would prefer to use cloudflare The configuration will be like the diagram: the ISM modem (1) will connect to a 4 port ethernet NIC on my pc running CHR and from there it will connect to my current Mikrotik to connect to the other devices on my network. I can't disable DHCP on the ISP Modem, so the CHR will get the public IP from the ISP modem but will ...

    $38 / hr (Avg Bid)
    10 bids

    TASK TO BE EXECUTED IMMEDIATELY (ANYDESK) AND COMPLETED WITHIN 1-2 HOURS OF ACCESS TIME. TASKS 1) I need to transfer my current Mikrotik RB4011Igs+ (ports are all GB) configuration to a CHR (cloud hosted router) on my pc running on a hyper-v vm using my 4 port NIC 2.5GbE. 2) Bypass ISP-blocked ports 80 and 443 using Mikrotik CHR on a hyper-v VM For port 80,443 bypass we would prefer to use cloudflare The configuration will be like the diagram: the ISM modem (1) will connect to a 4 port ethernet NIC on my pc running CHR and from there it will connect to my current Mikrotik to connect to the other devices on my network. I can't disable DHCP on the ISP Modem, so the CHR will get the public IP from the ISP modem but will use its own DHCP server as it is currently config...

    $39 / hr (Avg Bid)
    9 bids

    I require an expert who can efficiently migrate all emails from one cPanel account to another with a new web host. We're going from Namecheap (cPanel) to a new host with cPanel. I can provide logins to both. You don't have to worry about any DNS changes. The website is already set up on the new web host. The emails are around 10GB. I need this started and completed today. Thank you.

    $50 (Avg Bid)
    17 bids

    TASK TO BE EXECUTED IMMEDIATELY (ANYDESK) AND COMPLETED WITHIN 1-2 HOURS OF ACCESS TIME. 1) I need to transfer my current Mikrotik RB4011Igs+ (ports are all GB) configuration to a CHR (cloud hosted router) on my pc running on a hyper-v vm using my 4 port NIC 2.5GbE. 2) Bypass ISP-blocked ports 80 and 443 using Mikrotik CHR on a hyper-v VM For port 80,443 bypass we want to use cloudflare The configuration will be like the diagram: the ISM modem (1) will connect to a 4 port ethernet NIC on my pc running CHR and from there it will connect to my current Mikrotik to connect to the other devices on my network. I can't disable DHCP on the Modem, so the CHR will get the IP from the ISP modem but will use its own DHCP server as it is currently configured on the RB4011. - I...

    $36 / hr (Avg Bid)
    2 bids

    I'm in need of a tool, similar to ChatGPT, capable of producing output from large, bulky PDF files. This tool will be used on a Windows operating system and should ideally be able to handle between 1GB and 10GB of data efficiently. Key features should include: - Output production: This is the main functionality. The tool should extract and present data from the PDF files in a clear manner. - Search function: The ability to conduct an in-depth search within the bulk PDFs is vital. - Summary function: This feature should summarize the PDF data, making it easier for me to review and understand the content without opening each file. The ideal candidate will have extensive skills in programming, AI, data extraction and processing. Experience with similar projects is highly desir...

    $490 (Avg Bid)
    2 bids
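
    On the extraction and search side of the tool described above, the text of bulk PDFs can be pulled out with a library such as pypdf and indexed or summarised on top of that. A minimal sketch, assuming a folder of searchable (non-scanned) PDFs; the folder name and search term are placeholders.

        from pathlib import Path
        from pypdf import PdfReader

        def search_pdfs(folder, term):
            """Yield (file name, page number) for every page containing the term."""
            for pdf_path in Path(folder).glob("*.pdf"):
                reader = PdfReader(pdf_path)
                for page_no, page in enumerate(reader.pages, start=1):
                    text = page.extract_text() or ""
                    if term.lower() in text.lower():
                        yield pdf_path.name, page_no

        for name, page in search_pdfs("pdf_dump", "invoice"):   # placeholder folder and term
            print(f"{name}: page {page}")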

    I'm seeking a highly skilled database administrator to perform a significant Oracle to MySQL database migration. My existing database is extensive, well above 10GB, and encompasses structural components including schemas, data, stored procedures and triggers. Key elements of the project include: 1. Migrating Schema/Data: Careful movement of all schema definitions and data is essential to ensuring seamless transition without loss. 2. Transferring Stored Procedures: Stored procedures currently in place within the Oracle database will need to be assessed and recreated effectively in the MySQL environment. 3. Triggers Implementation: All triggers in use within your Oracle setup will also require migration. While the size of this project is substantial, there's no need ...

    $1071 (Avg Bid)
    27 bids

    I am seeking a skilled data analyst to provide insights on customer demographic data, sales data, and website traffic data for my business. Key tasks include: - Handling and analyzing a medium-sized data volume of 10MB to 10GB - Utilizing data visualization, statistical analysis, and predictive modeling to inform strategic business decisions - Ensuring data analysis is precise and leads to constructive business improvements Ideal skills to have for this job: - Expertise in data analysis and interpretation - Advanced skills in data visualization tools and statistical software - Experience with predictive modeling - Understanding of customer demographics, sales data, and website traffic analysis This project will require you to effectively communicate complex data in a simple,...

    $20 / hr (Avg Bid)
    29 bids

    I'm seeking a skilled data analyst to help with tasks including data cleaning and preprocessing, data visualization and reporting, and statistical analysis. The data they will be working with is of medium size, ranging between 1GB to 10GB. Prior experience with Python, SQL, and Excel is crucial for this project. Specifically, proficiency with the Python libraries Pandas and Matplotlib is required for the analytics components of the work. This project requires both strong technical skills and a thorough understanding of data analysis methodologies.

    $735 (Avg Bid)
    59 bids
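
    As a concrete illustration of the cleaning-plus-visualisation stack named above (Pandas and Matplotlib), here is a minimal sketch. The file name and column names are invented, and on a 1GB-10GB CSV the read is chunked so memory stays bounded.

        import pandas as pd
        import matplotlib.pyplot as plt

        # Chunked read keeps memory bounded on a multi-GB CSV (path and columns are placeholders).
        chunks = pd.read_csv("records.csv", chunksize=500_000)
        df = pd.concat(chunk.dropna(subset=["event_date"]) for chunk in chunks)

        # Basic cleaning: normalise a categorical column and parse dates.
        df["channel"] = df["channel"].str.strip().str.lower()
        df["event_date"] = pd.to_datetime(df["event_date"])

        # Simple report: record counts per month.
        df.set_index("event_date").resample("M").size().plot(kind="bar")
        plt.title("Records per month")
        plt.tight_layout()
        plt.savefig("records_per_month.png")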

    I'm looking to set up a new AWS server for my business. Currently, my website, email, and database are run...amount of data in my database is large, more than 50GB. Key tasks involved: - Setting up a new AWS server - Migrating all data, inclusive of Website, Email, and Database from GoDaddy to AWS - Ensuring the smooth and efficient transfer of data with no data loss Ideal Skills: - Proven expertise in AWS server setup and data migration - Proficient in handling and migrating large data (More than 10GB) - Detailed knowledge about securing the server and its data Please note that the preferred AWS region hasn't been decided yet, so the ability to advise on the best choice would be advantageous. Looking forward to your proposals. Please include an estimated timeline an...

    $218 (Avg Bid)
    37 bids

    I'm in search of a highly-skilled machine learning expert to facilitate data preprocessing, model selection, and feature engineering tasks. Alongside these duties, you should also conduct a thorough literature review relative to the assigned tasks. Fluency in Python is obligatory for this role. The candidate should be comfortable dealing with medium-sized datasets (from 1GB to 10GB). I would be partial to someone with experience in handling such data and is proficient in implementing machine learning tasks within Python’s ecosystem.

    $178 (Avg Bid)
    NDA
    40 bids

    I am a data science student who is in need of professional assistance for a data science ...preprocessing, analysis, visualization, and machine learning modeling. I am looking for proficient, hands-on individuals who can handle these tasks. - Programming Skills: The ideal freelancer should be proficient with Python and R, as these are the programming languages being used for the project. - Data Handling: The project involves dealing with a medium-sized data set, ranging between 1GB and 10GB. Therefore, the individual should be comfortable handling, manipulating, and analyzing data of this size. I am eagerly anticipating the collaborative synergy of a freelancer possessing excellent analytical thinking, attention to detail, and strong problem-solving abilities to advance thi...

    $296 (Avg Bid)
    12 bids

    ...on improving its performance through Indexing, Query Optimization, and Data Caching. Key Details: - Database Size: My database is quite large with a data volume exceeding 10GB. This calls for keen attention to detail and experience working with large data volumes. - Stored Procedures: There are multiple stored procedures currently facing performance issues, requiring urgent optimization to function at peak performance. Ideal Skills and Experience: - Extensive knowledge and experience in SQL Server DB management, indexing, query optimization, and data caching. - Proven experience working on large databases of over 10GB. - Ability to troubleshoot and optimize multiple stored procedures. - Strong communication skills to provide guidance on database administration for faster...

    $34 / hr (Avg Bid)
    39 bids

    I'm in urgent need of a competent Tableau expert who is proficient in tasks like data cleaning and wrangling, performing exploratory data analysis and also doing statistical modeling and forecasting. The dataset is of medium size, ranging between 1GB and 10GB, so experience managing and manipulating data of this scale is crucial. Key responsibilities shall include: - Executing data cleaning and wrangling to ensure high-quality data - Conducting exploratory data analysis to identify patterns, trends and outliers - Using statistical models to forecast future scenarios Ideally, you'll deliver the data interpretation through various visualization tools including, but not limited to, charts and graphs, maps and geographic data and infographics. Successful candidates must ...

    $29 / hr (Avg Bid)
    33 bids

    I am seeking a skilled freelancer with experience in database migration, specifically with SQL Server, to help my organization reduce operational costs through the strategic movement of our large database, which is over 10GB. This project aims not just to move data, but to do so in a way that leads to a significant reduction in costs without compromising on performance or scalability. **Requirements:** - Expertise in SQL Server and knowledge of cost-effective database solutions. - Experience in migrating large databases (10GB or more) with minimal downtime. - Ability to optimize the migration process for cost without sacrificing quality or data integrity. - Capacity to provide strategic recommendations for hosting solutions that offer reduced costs. - Provide a clear plan f...

    $1773 (Avg Bid)
    48 bids

    ...net/dataset/english-prescribing-data-epd The website link also includes examples of how the CKAN API or Python can be used to extract relevant data if you click "Explore" and then "Data API". The relevant medication codes (BNF_Code) are contained in the attached Word document. I would expect a CSV file for each month, with the following variables for each PRACTICE_CODE: ITEMS, QUANTITY, TOTAL QUANTITY, ADQUSAGE, NIC, ACTUAL_COST Ideal candidate skills: - Proficiency in Python programming. - Experience with the CKAN API or similar data APIs. - Ability to work with large datasets. - Strong data extraction and manipulation skills. Project requirements: - Identify an effective method, either through the CKAN API or Python, to extract pertinent data. - The specific data fields to be ...

    $289 (Avg Bid)
    34 bids

    ...net/dataset/english-prescribing-data-epd The website link also includes examples of how the CKAN API or Python can be used to extract relevant data if you click "Explore" and then "Data API". The relevant medication codes (BNF_Code) are contained in the attached Word document. I would expect a CSV file for each month, with the following variables for each PRACTICE_CODE: ITEMS, QUANTITY, TOTAL QUANTITY, ADQUSAGE, NIC, ACTUAL_COST Ideal candidate skills: - Proficiency in Python programming. - Experience with the CKAN API or similar data APIs. - Ability to work with large datasets. - Strong data extraction and manipulation skills. Project requirements: - Identify an effective method, either through the CKAN API or Python, to extract pertinent data. - The specific data fields to be ...

    $120 (Avg Bid)
    12 bids
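
    Both prescribing-data listings above point at the same CKAN-backed dataset, so one sketch covers them: page through the CKAN datastore_search action for a monthly resource, filter by BNF code, and keep only the requested fields. The endpoint URL pattern, resource ID, BNF code, and exact field casing below are assumptions to be checked against the dataset page and the attached document.

        import csv
        import json
        import requests

        CKAN_URL = "https://opendata.nhsbsa.net/api/3/action/datastore_search"   # assumed endpoint
        RESOURCE_ID = "EPD_202401"          # placeholder monthly resource ID
        BNF_CODE = "0401020K0AAAIAI"        # placeholder; take the real codes from the Word document
        FIELDS = ["PRACTICE_CODE", "ITEMS", "QUANTITY", "TOTAL_QUANTITY",
                  "ADQUSAGE", "NIC", "ACTUAL_COST"]          # field casing assumed

        rows, offset = [], 0
        while True:
            resp = requests.get(CKAN_URL, params={
                "resource_id": RESOURCE_ID,
                "filters": json.dumps({"BNF_CODE": BNF_CODE}),
                "limit": 10000,
                "offset": offset,
            })
            resp.raise_for_status()
            batch = resp.json()["result"]["records"]
            if not batch:
                break
            rows.extend({k: r.get(k) for k in FIELDS} for r in batch)
            offset += len(batch)

        # One CSV per monthly resource, as the listing requests.
        with open(f"{RESOURCE_ID}.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            writer.writeheader()
            writer.writerows(rows)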

    I need a data analyst for my medium-sized dataset, between 1GB and 10GB, derived from a survey. The scope of work will involve: - Data cleaning and preprocessing to ensure data quality and readiness for analysis. - Data visualization utilizing relevant and informative plots and charts. It's paramount to be able to communicate the results clearly and visually. - Statistical analysis to provide insights, trends, patterns or any relevant findings in the data. Fluency in data cleaning, visualization tools, and statistical software is expected. Previous experience with handling survey data would be a major plus. The goal is to extract meaningful information from the data that will aid in decision making.

    $54 / hr (Avg Bid)
    45 bids

    I'm looking to hire an experienced and meticulous professional for the task of deduplicating large JSON data sets of size between 1GB - 10GB. The main aim here is to enhance data quality, so attention to detail is key. Below are some of the skills and experiences that would place you in the ideal position for the job: - Proficiency in JSON and familiarity with large data sets - Strength in data pre-processing for better quality and structure - Good track record of similar engagements with quality output Interested freelancers are encouraged to include their past work, experience and detailed project proposals in their application. Your understanding of the task and the clear articulation of how you intend to carry it out will go a long way in determining our choice of freelan...

    $61 (Avg Bid)
    21 bids
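
    A minimal sketch of one common way to deduplicate files of this size without loading everything into memory at once, assuming the data is in JSON Lines form (one object per line, which the listing does not actually state): hash a canonical serialisation of each record and keep only first occurrences.

        import hashlib
        import json

        def dedupe_jsonl(src, dst):
            """Copy src to dst, dropping records whose canonical JSON was already seen."""
            seen = set()            # holds 32-byte digests only, not the records themselves
            kept = dropped = 0
            with open(src, encoding="utf-8") as fin, open(dst, "w", encoding="utf-8") as fout:
                for line in fin:
                    if not line.strip():
                        continue
                    record = json.loads(line)
                    # Canonical form: sorted keys, no whitespace differences.
                    canon = json.dumps(record, sort_keys=True, separators=(",", ":"))
                    digest = hashlib.sha256(canon.encode()).digest()
                    if digest in seen:
                        dropped += 1
                        continue
                    seen.add(digest)
                    fout.write(line if line.endswith("\n") else line + "\n")
                    kept += 1
            return kept, dropped

        print(dedupe_jsonl("input.jsonl", "deduped.jsonl"))   # placeholder file names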

    ...stress testing a PostgreSQL database. The script must effectively challenge our system with a medium-sized data scale (1-10GB). Key Requirements: - Proficient in scripting for PostgreSQL databases - Experience with pgbench or equivalent benchmarking tools - Capable of designing tests to handle medium-scale data volume - Understands the nuances of stress testing in database environments Ideal Skills: - PostgreSQL expertise - Proficiency in database scripting - Knowledge of performance testing best practices - Strong analytical and problem-solving abilities Outcome Expectations: - Deliver a robust pgbench script - Ensure the script can induce stress on a system with 1-10GB of data - Provide documentation on script usage and expected test outcomes By aiding in this projec...

    $67 (Avg Bid)
    4 bids
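
    One way the deliverable above could be packaged: a custom transaction file fed to pgbench, driven from Python so the scale factor, client count, and duration are easy to vary. The -i/-s/-c/-j/-T/-f flags are standard pgbench options; the database name and the row range in the custom script are placeholders tied to the chosen scale factor.

        import subprocess
        from pathlib import Path

        DB = "stress_db"   # placeholder database name

        # Custom transaction: an indexed read plus a write against the tables pgbench -i creates.
        Path("custom.sql").write_text(
            "\\set aid random(1, 20000000)\n"   # 100,000 accounts per scale unit x scale 200
            "SELECT abalance FROM pgbench_accounts WHERE aid = :aid;\n"
            "UPDATE pgbench_accounts SET abalance = abalance + 1 WHERE aid = :aid;\n"
        )

        # Initialise pgbench's tables; scale factor 200 yields roughly a few GB of data.
        subprocess.run(["pgbench", "-i", "-s", "200", DB], check=True)

        # Run the custom script: 20 clients, 4 worker threads, 10 minutes.
        subprocess.run(
            ["pgbench", "-c", "20", "-j", "4", "-T", "600", "-f", "custom.sql", DB],
            check=True,
        )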