23 Hot Programming Trends and 23 Going Cold
This is a comprehensive list of hot programming trends, and of those that are declining in popularity.
I have a web application (a reporting tool) built on Java Spring Boot and hosted on AWS. I need help integrating this application with a Kerberos-enabled sandbox. * I want help deploying the Hortonworks sandbox on Microsoft Azure or AWS. * From the application (which is on a different server), we should be able to connect to Kerberos-enabled Hadoop on Azure/AWS through a keytab.
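For context on the keytab requirement above: a JVM application on a separate server typically authenticates to a Kerberos-enabled Hadoop cluster non-interactively by pointing a JAAS login module at a keytab file (Hadoop clients can also call `UserGroupInformation.loginUserFromKeytab(principal, keytabPath)` directly). The fragment below is only a minimal sketch; the principal name, realm, and keytab path are hypothetical placeholders, not values from this project.

```
// jaas.conf — hypothetical principal and keytab path; these must match your KDC setup
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="/etc/security/keytabs/reporting-app.keytab"
  principal="reporting-app@EXAMPLE.COM"
  storeKey=true
  doNotPrompt=true;
};
```

The same principal must exist in the cluster's KDC, and the keytab file should be readable only by the application's service account.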
Video tutorial on Big Data Hadoop. It would be a screen recording with voice-over. The recording will be approximately 7-8 hours.
Research goal: 1) My idea is to employ a machine learning technique to adaptively optimize the number of data streams for a given protocol, in order to achieve the highest throughput. 2) How to improve...number of data streams, monitor host/network conditions, and measure the corresponding transport performance. Note that TCP fairness may need to be considered for some UDP-based protocols if running on the internet. b) Select an appropriate machine learning algorithm (one could start from a neural network) and train the model using the data collected at step a). c) Build the model from the measurement data, using Hadoop MapReduce / Spark for data analysis; some existing machine learning modules in big data systems could be used. d) Once the model training is complete, apply the model for onlin...
Need an interactive dashboard application using Spark and a Zeppelin notebook. Apply filters on three columns and display the results in the Zeppelin notebook.
I have a request for a Big Data Architect, with relocation to the Netherlands until 31st December and possible extension. Flights and accommodation are paid. Tasks: ...security and legal/compliance teams · Design system architectures for real-time, cost-effective data processing and routing · Proactively identify techniques that make our data assets easier to use or find · Democratise access to information assets for business users. Required skills: · 5 years of experience as a Big Data Architect · Experience or a profound interest in Big Data technologies (Hadoop, Spark, Elasticsearch, Kafka, ...) · Knowledge of DAMA/DMBOK, TOGAF, TMForum is a plus. Start: 1st of September 2017. Duration: until 31st of December 2017, with possible extensions. Wor...
Hello. I would like to hire a technical writer who can write a project report. The topic is SAP and Hadoop. The technical writer should have knowledge and experience of SAP and Hadoop, and should have experience in writing project reports. (Removed by Freelancer.com Admin)
Good knowledge of Big Data and Hadoop is required, along with the ability to communicate properly in English. The work is to complete one project entirely using the above technologies, with good hands-on experience; some progress on the project is expected every week.
Hi, I am looking for a freelance trainer who can train our online batches on Big Data or Hadoop. Please apply only if you are experienced.
Looking for a Hadoop admin: someone who is good at installation, knows how to tutor students, and can teach in New York at night. Please don't waste my time and yours if you are not serious. There are almost 5 students. This will be 4 hours a week and 16 hours a month.
Hi, I have some problems running MapReduce commands on Hadoop and need some help with them. Thanks, Alex
Hello, we are looking for freelance trainers for Puppet, Python, Amazon Web Services, DevOps, and Hadoop administration. If you have prior experience in online instructor-led training, please apply.
...Assist with our Tableau 9 to Tableau 10 migrati... -Establish and validate data source extract refreshes -Assist with migrating the production server when the development server migration is complete. Notes from our manager call: The person will be publishing version 10 workbooks based on existing version 9 workbooks, so they need to make sure the workbooks function in 10 the same as they did in 9. The data sources for the workbooks are in Hadoop Hive, so experience with Hadoop would be nice. The team is 8 people the manager described as really smart, led by experienced SMEs and analysts, so the selected person should be able to take direction well. The key is to understand workbooks, have experience with publishing both 9 and 10 workbooks, and take direction well. Not much progra...
I am looking for someone who has experience with HDFS, YARN, HBase, and OpenTSDB as part of our Hadoop setup, and who can help us configure our database, since our current DB is slow. Thanks
I need a Hadoop developer for job support
I have some problems running some commands on Hadoop. The following command gives me the error ": No such file or directory": [cloudera@quickstart ~]$ echo "this is a test and this should count the number of words" | /home/cloudera/homework5/ | sort -k1,1 | /home/cloudera/homework5/
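For context, pipelines of this shape are the standard way to test a Hadoop Streaming mapper and reducer locally: the mapper emits `word<TAB>1` pairs, `sort -k1,1` groups the keys, and the reducer sums counts per word. The scripts in the command above are not shown (the paths appear truncated, which would itself explain the "No such file or directory" error), so the sketch below is a hypothetical word-count pair, not the poster's actual homework code.

```python
from itertools import groupby

def mapper(lines):
    """Emit (word, 1) pairs, one per word, as a streaming mapper would print them."""
    for line in lines:
        for word in line.split():
            yield word, 1

def reducer(pairs):
    """Sum counts per word; assumes input sorted by key (the `sort -k1,1` step)."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    text = ["this is a test and this should count the number of words"]
    counts = dict(reducer(sorted(mapper(text))))
    print(counts["this"])  # prints 2: "this" appears twice
```

When run on a cluster, each function would instead read from stdin and print tab-separated lines, with Hadoop Streaming handling the shuffle in place of `sort`.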
Looking for Tutors in Hadoop, Cloud Computing and Mobile Application Development
A talented IT professional with twelve years of experience in design and development on Big Data (Hadoop, Spark, Kafka, NoSQL), data engineering platforms, and Microsoft platforms, with object-oriented and scripting languages. As a big data technical leader, has knowledge of multiple scripting and programming languages. Possesses strong abilities in big data environments and is extremely analytical, with excellent problem-solving skills. Proven ability to troubleshoot and solve complex problems and lead small project groups. Demonstrated initiative to learn and explore new technologies. Strong sense of "do it right the first time", meticulous attention to detail, and the ability to communicate technical knowledge to peers and support personnel. Has experience of the complete SDLC and has handled build an...
Need a Hadoop expert who is good at configuring Knox SSO for Ranger, Zeppelin, and Ambari.
Need two emails with the following descriptions. First: write a mail to the training and placement cells of IT engineering colleges. In this mail, describe the benefits of a Hadoop big data course to the TPO, and ask the TPO to recommend this course to their students for a better chance of placement. Second: an email for "internship for MBAs". Title: marketing communication for a startup; interns have to spread word of mouth about the startup's courses to IT students at colleges and outside the IT training institutes.
Needed: an expert for Perl training e-learning content development. It would be a screen recording with voice-over. Each recording will be approximately 7-8 hours, and the courses will cover Golang, Django, Big Data Hadoop, and Puppet from basics to advanced. Each course is separate; when you bid, please mention the course. The bid price is for a single course.
Need someone to help explain the real-time flow of the project. The candidate should have good experience with Spark and Hadoop. It is good to have project domain experience in one of the following: banking, retail, finance, or healthcare.
...including Hive SQL for Hadoop/Big Data platforms. • 5+ years of dashboard and report development experience for data quality • Proven experience designing, developing, and implementing dashboards and reports using SAS Visual Analytics Reporting (included in the SAS Data Quality Advanced bundle DM 2.7) • Proven experience working with the SAS LASR platform (included in the SAS Data Quality Advanced bundle DM 2.7) • Experience executing data profiling using DQ solutions such as SAS DataFlux, IBM QualityStage, Informatica, or Podium • Experience designing, developing, and implementing DQ validation rules using solutions such as SAS DataFlux, IBM QualityStage, or Podium to inspect and monitor the health of the data • Experience working with B...
$35 for each article; $350 total for 10 articles. Each article should have around 350 to 360 words. Details for the first article are given below. We need to focus on keywords related to "qa testing training", "selenium webdriver training", "DevOps Training", Java training, .NET training, business analyst training, and big data / Hadoop training. Website: For California: Fremont, San Jose, Pleasanton, Los Angeles, Cupertino, San Ramon, Sacramento, Yuba City, Irvine in Southern California, San Francisco, Sunnyvale, Union City -- Check the keywords related to "software QA testing training and job placement" and "Selenium webdriver automation testing training and job placement" ============ I will update the details for r...
I want a small project with Kafka and Hadoop/Spark streaming. Can anyone please explain it to me with detailed coding?
Admin tasks: cluster installation, configuration & administration; cluster planning & maintenance; resource management; monitoring & logging; security management (Kerberos authentication & Apache Sentry authorization); troubleshooting; cluster monitoring; backup and recovery tasks; disaster recovery. Technologies: Hive, Pig, Oozie, Impala, MapReduce, YARN, Ganglia, Cloudera Manager, HBase & Spark.
We are online IT training providers. We need very good technical content and article writers to write technical blogs on different technologies. The person must have good technical skills in software QA testing, Selenium automation testing, Java, .NET, big data / Hadoop, machine learning, and other latest technologies. Content / article writing should follow SEO standards for search engines. We need the work for several months; I need a quote per word. We need at least 15 to 20 articles per month. Please check the websites, and apply for this project only if you are the right person; it is a waste of time for you and me if you are not able to write good content. We don't want copied content. We have SEO
Hadoop Administration for 100 hours (44T + 56L) in PG-DHPCSA, ACTS, Pune
Hello, I'm looking for a graphic designer who can help me redesign my existing website into something amazing. The focus area of my business is open-source Hadoop, machine learning, and artificial intelligence.
A website needs to be built with a few pages. Page 1: lists the courses we provide (for example: Hadoop Training, Spark Training, and Java Training), including the topics covered in each training; it should display the upcoming batches, and users should be able to enroll for the training. This page should have an option for customers to send queries. For example, if a user clicks on Hadoop Training, it should display the list of topics covered in the training and the training start date. Page 2: the blog, where the admin can post topics on a daily basis; it should also display the recent posts. Users should be able to post their comments on topics added by the admin (the admin should have a login page fo...
Hi folks, I am looking for a person who is extremely good at Linux administration (SSSD, PAM authentication methods) and Hadoop security (Kerberos, Ranger, Knox, SSL/TLS a must) and LDAP integrations with Knox, Ranger, and Zeppelin.
I am in need of a Hadoop/Android/Selenium trainer to conduct a seminar or workshop for college students on an hourly/daily basis: 3 days only, 3 hours of training each. Good communication with excellent, simplified explanation of concepts is required. E-books and a device with the software installed are the candidate's responsibility. Contact me using the given [Removed by Freelancer.com Admin]
An attractive commission is guaranteed. I need some help with selling an e-learning course in a very niche domain of Big Data technology, also known as Cloudera Hadoop Administration for Apache Hadoop. You can find the e-learning course at
I need you to write a research article on Big Data Hadoop.
I have a Java file, and I need to connect that file with other dependencies, which I will explain in the chat. This project will require knowledge of the Hadoop ecosystem and Oracle as well.
Software testing, web testing, MapReduce, Hadoop.
Need someone with knowledge of Hadoop, MapReduce, SQL, Python, and data mining.
The MTBF of a Hadoop node and an enterprise storage server is three and six years respectively [6], [20]. Based on these values, Table I compares the probability of data loss per day in HDFS and NAS filers using Equation 3. We compare an HDFS cluster with 1000 nodes to a filer with 100 nodes, assuming that they can offer the same storage capacity. This is a valid assumption given recent trends in storage capacity: enterprise systems can easily support more than 240 TB of storage [24], while a typical Hadoop node has 12 TB to 24 TB of storage. Filers can offer a probability of data loss with one replica of 4.4 × 10^-2, compared to HDFS with 3 replicas, which has a 6.44 × 10^-2 probability of data loss. Given the high number of disks in a sin...
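The per-day, per-node failure probability implied by the quoted MTBF values can be sketched as follows. This is only a back-of-the-envelope check assuming exponentially distributed, independent failures; it does not reproduce the paper's Equation 3 or the cluster-level loss figures in Table I, which additionally depend on the replica-placement model.

```python
import math

# MTBF values quoted in the text, in years
MTBF_HADOOP_YEARS = 3.0
MTBF_FILER_YEARS = 6.0

def daily_failure_prob(mtbf_years):
    """P(a node fails on a given day) under an exponential failure model:
    p = 1 - exp(-t / MTBF), with t = 1 day and MTBF expressed in days."""
    mtbf_days = mtbf_years * 365.0
    return 1.0 - math.exp(-1.0 / mtbf_days)

p_hadoop = daily_failure_prob(MTBF_HADOOP_YEARS)
p_filer = daily_failure_prob(MTBF_FILER_YEARS)
print(f"Hadoop node: {p_hadoop:.2e} per day")  # ~9.1e-04
print(f"Filer node:  {p_filer:.2e} per day")   # ~4.6e-04
```

Turning these per-node values into the quoted per-day data-loss probabilities (4.4 × 10^-2 and 6.44 × 10^-2) requires combining them across 100 or 1000 nodes under the paper's replication assumptions, which are not reproduced here.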
Thinking of becoming a data scientist? Here are 9 skills that you should look at adopting.