Search Results

Search found 110151 results on 4407 pages for 'real time data integratio'.

Page 119/4407 | < Previous Page | 115 116 117 118 119 120 121 122 123 124 125 126  | Next Page >

  • Why can't the IT industry deliver large, faultless projects quickly as in other industries?

    - by MainMa
    After watching National Geographic's MegaStructures series, I was surprised how fast large projects are completed. Once the preliminary work (design, specifications, etc.) is done on paper, the realization of huge projects takes just a few years, or sometimes a few months. For example, the Airbus A380 was "formally launched on Dec. 19, 2000", and by "early March, 2005" the aircraft was already being tested. The same goes for huge oil tankers, skyscrapers, etc.

    Comparing this to the delays in the software industry, I can't help wondering why most IT projects are so slow, or, more precisely, why they cannot be as fast and faultless at the same scale, given enough people. Projects such as the Airbus A380 present both:

    Major unforeseen risks: while this is not the first aircraft ever built, it still pushes the limits of the technology, and things that worked well for smaller airliners may not work for the larger one due to physical constraints. In the same way, it uses new technologies that were not available in 1969 when the Boeing 747 was built.

    Risks related to human resources and management in general: people quitting in the middle of the project, inability to reach a person because she's on vacation, ordinary human errors, etc.

    With those risks, people still complete projects like those large airliners in a very short period of time, and despite the delivery delays, those projects are still hugely successful and of high quality.

    When it comes to software development, the projects are hardly as large and complicated as an airliner (both technically and in terms of management), and they have slightly fewer unforeseen risks from the real world. Still, most IT projects are slow and late, and adding more developers to a project is not a solution (going from a team of ten developers to two thousand will sometimes deliver the project faster, sometimes not, and sometimes will only harm the project and increase the risk of not finishing it at all). Those projects that are delivered often contain many bugs, requiring consecutive service packs and regular updates (imagine "installing updates" on every Airbus A380 twice per week to patch the bugs in the original product and prevent the aircraft from crashing).

    How can such differences be explained? Is it due exclusively to the fact that the software development industry is too young to manage thousands of people on a single project in order to deliver large-scale, nearly faultless products quickly?

    Read the article

  • Prevalence of WMI enabled in real Windows Server networks

    - by TripleAntigen
    I would like to get opinions from systems administrators on how common it is for WMI functionality to actually be enabled in corporate networks. I am writing an enterprise network application that could benefit from the features of WMI, but I noted, after creating a virtual network based on Server 2008 R2, that WMI seems to be disabled by default. Do systems admins in real corporate networks enable WMI, or is it usually left disabled for security reasons? What is it used for when it is enabled? Thanks for any advice! MORE INFO: I should have said that I really need to be able to query the workstations, but I understand that by default the WMI ports are blocked by the Windows 7 and XP firewalls (at least). Do you use some sort of group policy or other method to leave a hole open for WMI on the workstations, or are only the servers of interest? Thanks for the responses!
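    For context, this is roughly what a remote query looks like once WMI is reachable; a minimal sketch assuming the third-party wmi package (a pywin32 wrapper, Windows-only), with placeholder hostname and credentials:

    ```python
    # Remote WMI query sketch; "workstation01" and the credentials are
    # placeholders, and the machine must allow remote WMI/DCOM access.
    import wmi

    conn = wmi.WMI(computer="workstation01",
                   user=r"CORP\admin", password="secret")

    for os_info in conn.Win32_OperatingSystem():
        print(os_info.Caption, os_info.Version)

    for disk in conn.Win32_LogicalDisk(DriveType=3):  # fixed disks only
        print(disk.DeviceID, disk.FreeSpace, disk.Size)
    ```

    On the firewall side, Windows ships a predefined rule group named "Windows Management Instrumentation (WMI)" that can be enabled centrally through Group Policy, which is the usual alternative to opening raw DCOM ports by hand.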

    Read the article

  • Oracle and ROLTA: Collaboration for Analytical Master Data Management

    - by Mala Narasimharajan
    Oracle and ROLTA have joined forces to put together an educational webinar series on best practices for maximizing data integrity using analytical master data management. Hear replays of webcasts by Gartner as well as customer success stories from Navistar, and learn how Master Data Management in the enterprise is the right choice for tackling heterogeneity and data degradation and for improving analysis of your business. For more information on this collaboration, click here. For additional information on Oracle's solution suite for MDM, click here.

    Read the article

  • Better media player than VLC with a real library [closed]

    - by elluca
    Hi, I have been using VLC for years, but my library of movies is growing and I would like better management of it. Sure, I could use iTunes for that, but I really hate QuickTime; the UI is even worse than the one Windows Media Player has. I looked at Winamp, Windows Media Player and Songbird, but none has a UI comparable to VLC's. The most important feature: hotkeys. I want to be able to skip through a video using the mouse scroll wheel. Do I have to code one myself? ;) Thanks, Lukas

    Read the article

  • BIG DATA eBook - Now Available

    - by Javier Puerta
    The Big Data interactive e-book "Meeting the Challenge of Big Data: Part One" has just been released. It's your one-stop shop for information about Big Data and the Oracle offering around it. The new e-book (available on your computer or iPad) is packed with multimedia resources to educate Oracle staff, customers, prospects and partners on the value of Big Data. It features videos, tutorials, podcasts, reports, white papers, datasheets, blogs, web links, a 3-D demo, and more. Go and get it here!

    Read the article

  • Will KeywordSpy Really Save You Time?

    Finding the best keywords for your projects has proven to be one of the most tedious chores in Internet marketing. The point of a keyword is to make sure that people who are searching for your product stay on track; keywords are what allow people to come across your website. If you do not have much free time, locating the right keywords can be a difficult as well as time-consuming task.

    Read the article

  • How to make a PHP function triggered automatically at a user defined time

    - by mithilatw
    I am developing an internal system for a company with PHP, using the Zend framework. I need one of its functions to execute at a time specified by the user. My research turned up several ways of doing this with cPanel cron jobs or scheduled tasks set up on the server, but in this scenario I don't have cPanel, and I already use scheduled tasks. My challenge is to provide an interface for the user to specify the time at which the function is triggered.
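    The usual pattern without cPanel is to let the interface merely store the user's chosen time, and have the one recurring scheduled task poll for due jobs every minute. The asker's stack is PHP/Zend; this minimal sketch uses Python and a hypothetical SQLite jobs table just to show the shape of the pattern:

    ```python
    # poll_jobs.py: invoked every minute by the system scheduler, e.g. cron:
    #   * * * * * /usr/bin/python3 /path/to/poll_jobs.py
    # The web UI only has to INSERT the user's chosen run_at time.
    import sqlite3
    from datetime import datetime

    def run_job(job_id):
        print(f"running job {job_id}")  # application-specific work goes here

    conn = sqlite3.connect("jobs.db")
    conn.execute("CREATE TABLE IF NOT EXISTS jobs ("
                 "id INTEGER PRIMARY KEY, run_at TEXT, done INTEGER DEFAULT 0)")
    now = datetime.now().strftime("%Y-%m-%d %H:%M")
    due = conn.execute("SELECT id FROM jobs WHERE run_at <= ? AND done = 0",
                       (now,)).fetchall()
    for (job_id,) in due:
        run_job(job_id)
        conn.execute("UPDATE jobs SET done = 1 WHERE id = ?", (job_id,))
    conn.commit()
    ```

    The same structure carries over directly to a PHP script fired by the existing scheduled task.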

    Read the article

  • Logging every time a command is run

    - by Tom D
    I want to log every time I run a certain type of command in the terminal. For example, every time I run:

        sudo apt-get install [something]

    I want to add [something] to a log file in my home directory that will look like the following:

        [timestamp] [something]
        2012-10-02 mysql-server
        2012-10-03 ruby1.9.1
        2012-10-06 gedit-plugins
        2012-10-07 gnome-panel synaptic

    What's the easiest way to make this happen automatically?
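    One way to get this automatically, sketched here in Python rather than as a shell alias; the file name, alias and log path are illustrative:

    ```python
    #!/usr/bin/env python3
    # apt-log.py, a hypothetical wrapper: alias apt-get to it, e.g. in ~/.bashrc:
    #   alias apt-get='sudo python3 ~/bin/apt-log.py'
    # Caveat: under sudo, "~" is root's home, so the log path is set explicitly.
    import os
    import sys
    from datetime import date

    LOG = "/home/tom/install.log"  # placeholder; adjust to your home directory

    # log only "install" invocations, skipping option flags
    if len(sys.argv) > 2 and sys.argv[1] == "install":
        packages = [a for a in sys.argv[2:] if not a.startswith("-")]
        if packages:
            with open(LOG, "a") as f:
                f.write(f"{date.today()} {' '.join(packages)}\n")

    # hand control to the real apt-get with the original arguments
    os.execvp("apt-get", ["apt-get"] + sys.argv[1:])
    ```

    Note that apt already keeps its own record of installs in /var/log/apt/history.log, which may be enough by itself.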

    Read the article

  • Contiguous Time Periods

    It is always better, and more efficient, to maintain referential integrity by using constraints rather than triggers. Sometimes it is not at all obvious how to do this, and history tables, along with other temporal data tables, present data-checking problems that are difficult to solve with constraints. Then Alex Kuznetsov came up with a good solution, and now history tables can benefit from more effective integrity checking. Joe explains...

    Read the article

  • Chapter 5: From 2005 to 2010: Business Logic and Data

    After reading this chapter, you will be able to: use the Entity Framework (EF) to build a data access layer against an existing database or with the Model-First approach; generate entity types from the Entity Data Model (EDM) Designer using the ADO.NET Entity Framework POCO templates; get data from web services; and use data caching with Microsoft Windows Server AppFabric (formerly known by the codename "Velocity").

    Read the article

  • Real-time Image Resize, Cropping and Caching Server Product

    - by Elijah
    I'm investigating what products are out there that will allow you to request images through an HTTP API in arbitrary sizes. The server would sit behind a CDN but would still need to handle a fair bit of traffic and possibly be load-balanced. I've been tasked with writing such a service, but I wanted to do some due diligence to see what commercial or open-source solutions exist. Google has not been particularly helpful, perhaps because I have been searching for the wrong term. Third-party sites and services are out of the question because of corporate policies.
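    For reference, the core of such a service is small. Here is a minimal sketch of dynamic resizing with an on-disk cache, assuming Flask and Pillow; the directory names and URL scheme are made up:

    ```python
    # Sketch of a resize endpoint: /img/800x600/photo.jpg resizes once,
    # then serves the cached copy on every later request.
    import os
    from flask import Flask, abort, send_file
    from PIL import Image

    app = Flask(__name__)
    SOURCE_DIR = "originals"   # hypothetical layout
    CACHE_DIR = "cache"

    @app.route("/img/<int:w>x<int:h>/<name>")
    def resized(w, h, name):
        if w > 4000 or h > 4000:          # refuse absurd sizes
            abort(400)
        src = os.path.join(SOURCE_DIR, name)
        if not os.path.isfile(src):
            abort(404)
        cached = os.path.join(CACHE_DIR, f"{w}x{h}-{name}")
        if not os.path.isfile(cached):
            os.makedirs(CACHE_DIR, exist_ok=True)
            img = Image.open(src)
            img.thumbnail((w, h))         # preserves aspect ratio
            img.save(cached)
        return send_file(cached)

    if __name__ == "__main__":
        app.run()
    ```

    Behind a CDN, most requests never reach this code at all, which is what makes the approach scale.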

    Read the article

  • Come up with a real-world problem in which only the best solution will do (a problem from Introduction to algorithms) [closed]

    - by Mike
    EDITED (I realized that the question certainly needs a context.)

    Problem 1.1-5 in Introduction to Algorithms by Thomas Cormen et al. reads: "Come up with a real-world problem in which only the best solution will do. Then come up with one in which a solution that is 'approximately' the best is good enough."

    I'm interested in the first statement. As I understand it, it asks for a real-world problem where only the exact solution will work, as opposed to a real-world problem where a good-enough solution is acceptable. So what is the difference between an exact and a good-enough solution?

    Consider a physics problem, for example the simulation of fluid flow in a permeable medium. To make this simulation possible, some simplifying assumptions have to be made when deriving the mathematical model; otherwise the model becomes complex and unsolvable. Virtually any particle in the universe has some influence on the fluid flow, but not all particles are equal: those that form the permeable medium are much more influential than ones located light years away. Then, when the mathematical model needs to be solved, an exact solution can rarely be found unless the model is simple enough (which probably means the model isn't close to reality). We take an approximate numerical method and, after hours of coding and days of verification, come up with a program or algorithm which is a solution. If the model and the algorithm give results close to the real problem to some degree, that is a good-enough solution.

    It's worth noting the difference between an exact solution algorithm and an exact computation result. When considering real-world problems on real-world computing machines, I believe no solution to a physical problem that involves calculation can be exact, because universal physical constants are represented approximately in the computer. Any number is represented with limited precision, limited at least by the amount of memory available to the computing machine.

    I can imagine plenty of problems where a good-to-some-degree solution will work: train scheduling, automated trading, satellite orbit calculation, health-care expert systems. In those cases exact solutions can't be derived due to constraints on computation time, limitations of computer memory, or the nature of the problem.

    I googled this question and like what this guy suggests: there are kinds of mathematical problems that need exact solutions. (A little note here: because the question is taken from the book Introduction to Algorithms, the term "solution" means an algorithm or a program, which in this case gives an exact answer on each input.) But that's probably more of theoretical interest.

    So I would like to narrow the question down to: what are the real-world, practical problems where only the best (exact) solution algorithm or program will do, and not a good-enough one? There are problems like breaking cryptographic ciphers where only an exact solution matters in practice, and where, again in practice, deciphering without knowing the secret should take a reasonable amount of time. Returning to the original question, this is a problem where a good-enough (fast-enough) solution will do: there is no practical need for an instant crack, however desirable. So the quality of "best" can be understood in any sense: exact, fastest, requiring the least memory, generating minimal network traffic, etc. And still I want this question to be theoretical if possible.

    In a sense, there may be an example of a computer X with a limited resource R of amount Y, where the best solution to problem P is the one that takes no more than the available Y for inputs of size N*Y. But that is the problem of finding a solution for P on computer X, which is... well, good enough.

    My final thought is that we live in a world where programming solutions to practical problems are required to be good enough; in rare cases very, very good, but still not the best. Isn't it so? :) If not, can you provide an example? Or can you name any such unsolved problem of practical interest?
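    The remark about limited precision is easy to demonstrate in any language; two lines of Python show that even elementary arithmetic on a real machine is an approximation:

    ```python
    # 0.1, 0.2 and 0.3 have no exact binary representation in 64-bit floats
    print(0.1 + 0.2)          # 0.30000000000000004
    print(0.1 + 0.2 == 0.3)   # False
    ```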

    Read the article

  • Big Data Sessions at Openworld 2012

    - by Jean-Pierre Dijcks
    If you are coming to San Francisco and you are interested in all aspects of big data, this Focus On Big Data is a must-have document. Some (other) highlights:
    - A performance demo of a full-rack Big Data Appliance in the engineered systems showcase
    - A set of hands-on labs on how to go from a NoSQL DB to an effective analytics play on big data
    - Much, much more
    See you all in a few weeks in SF!

    Read the article

  • Today Will Be 11/11/11 – 11:11:11 for the First Time in 100 Years

    - by The Geek
    Today at 11:11 and 11 seconds, the date and time will be a perfect same-number palindrome: it will read the same backwards and forwards, using a single repeated digit. It will be 11:11:11 on 11/11/11, and that won't happen again for another 100 years. Naturally, it'll happen again at 11 PM, for those who don't observe military time.

    Read the article

  • Real time audio streaming

    - by Josh K
    I have a remote computer running OS X. I would like to stream the audio from its microphone input over the network so I can listen to it. Primarily I want to do this because I'm out of the office but still need to communicate with people there. I would like to use VLC but am not fully aware of the options available. I tried SoundFly (as recommended by another answer), but it didn't seem to want to connect. At this point I should note that I'm connecting to the remote computer over a VPN (Hamachi); I can open up ports and such fine, though, so I should be able to do this. Alright, I found Nicecast, which does exactly what I want, but I would prefer not to shell out $40 for it.

    Read the article

  • Google Analytics: Custom variables issue difference in data

    - by Bart
    We've set up tracking through custom variables in Google Analytics to measure which offices are getting the most traffic. The custom variable consists of a key (office) and a value (the office name). In the Custom Variables report under Audience we get no data (actually we got one hit, but we think that data is way off). When we set up advanced segments with filters on the key and value, we get the correct data. Now we are wondering why we aren't getting that data in the Custom Variables report.

    Read the article

  • General programming techniques to speed up coding time

    - by mcwise
    I am preparing for a programming contest in C++ where it is all about producing working code in a short time. An example would be using a macro to get the minimum of two ints (but I was told that you shouldn't use macros because they are not type-safe) or using memset to initialize arrays (but I was told that you shouldn't use memset in C++). This leads to the question: what kinds of coding techniques exist that are also acceptable at a real job?

    Read the article

  • Storing data on server [closed]

    - by Maciekp
    How am I supposed to store data on a server using something other than databases, text files and images? And how could someone implement storage like Facebook's Graph API (http://developers.facebook.com/docs/reference/api/)? When I go to https://graph.facebook.com/19292868552 it shows me such data; how can it be stored? I guess it's not a MySQL database. PS. Link to article: http://jayant7k.blogspot.com/2009/05/how-facebook-stores-billions-of-photos.html. How can concurrent write requests be handled while storing data in a text file?
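    One thing worth separating out: the Graph API endpoint in the question simply serves JSON over HTTP, whatever storage engine sits behind it. A quick way to inspect that, as a sketch assuming the requests package (newer Graph API versions may require an access token):

    ```python
    # Fetch the public JSON for a Graph API object and print it as a dict
    import requests

    resp = requests.get("https://graph.facebook.com/19292868552")
    print(resp.json())
    ```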

    Read the article

  • Transfer iptables rules to another server (almost) real time

    - by MrShunz
    I'm running two cPanel servers with the ConfigServer Security & Firewall plugin. One of the functions of the plugin is to block via iptables (temporarily and/or permanently) IPs which fail various authentications (POP3/IMAP, SMTP, FTP, webmail, mod_security and such). Now I'd like to push those IP blocks to the border router to drop packets as soon as possible (and so protect the other machines on the network). Keep in mind that after N failed logins an IP is blocked for 5 minutes, then re-allowed; if multiple bans occur in an hour, the IP is blocked permanently and must be unblocked by hand. So I need a near-real-time solution. What I'm looking for is a better way than firing cron jobs on both the cPanel servers and the border router to: dump the rules to a file, transfer the file to the border router (via scp/sftp), and load the rules from the file on the border router. I'm aware that I will need some scripts to parse and modify the rules, as the cPanel servers have one ethernet interface and some aliases while the border router has two ethernet interfaces and some loopbacks. All machines involved run Linux. EDIT as per @pjmorse's comment: the plugin consists of a bunch of Perl and config files. The part I'm interested in is a process (lfd) which scans log files and installs iptables rules (and sends an alert email). It upgrades quite often (once or twice a week) and is itself 7,000 lines of Perl, so I'm not comfortable tampering with it.
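    For the dump/transfer/load cycle itself, here is a rough sketch in Python, assuming the paramiko package and key-based SSH to the router; the hostname, remote path and the rewrite step are placeholders:

    ```python
    #!/usr/bin/env python3
    # Dump local iptables rules, rewrite them for the router's interface
    # layout, push them over SFTP and apply them with iptables-restore.
    import subprocess
    import paramiko

    ROUTER = "border-router.example.com"
    REMOTE_FILE = "/tmp/csf-rules.v4"

    def rewrite_rules(rules: str) -> str:
        # translate the cPanel boxes' eth aliases to the router's interfaces
        return rules

    # 1. dump the current rules on the cPanel box
    rules = subprocess.run(["iptables-save"], check=True,
                           capture_output=True, text=True).stdout

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(ROUTER, username="root")

    # 2. transfer the rewritten rules
    sftp = ssh.open_sftp()
    f = sftp.file(REMOTE_FILE, "w")
    f.write(rewrite_rules(rules))
    f.close()
    sftp.close()

    # 3. load them on the router and wait for completion
    _, stdout, _ = ssh.exec_command(f"iptables-restore < {REMOTE_FILE}")
    stdout.channel.recv_exit_status()
    ssh.close()
    ```

    To get closer to real time, the same script could be fired from an inotify watch on lfd's log file rather than from a cron interval.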

    Read the article

  • Data Compare is Finally Back in VS 2012

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/07/01/data-compare-is-finally-back-in-vs-2012.aspx
    I've been missing the data compare tool since moving from VS 2010. I've installed the VS 2012 Update 3 and then the SQL Server Data Tools - June 2013 update. I don't think Update 3 is required, but it's a good upgrade to do anyway. http://blogs.msdn.com/b/ssdt/archive/2013/06/24/announcing-sql-server-data-tools-june-2013.aspx

    Read the article

  • No Time for IT? Try Managed Services

    If maintaining your small business computer systems is a drag on your time and psyche, consider IT outsourcing. It frees up time, delivers better results, and a recent study shows it's more affordable than you might think.

    Read the article
