Search Results

Search found 9847 results on 394 pages for 'cloud backup'.


  • Oracle OpenWorld 2012. Mark Hurd's press conference: the pillars of Oracle's strategy

    - by Fabian Gradolph
    We are on day two of OpenWorld 2012. The session began with a presentation by Mark Hurd, who after his talk held a packed press conference; there were probably more than 80 or 90 journalists in the room. Oracle's President gave us a very good summary of the company's strategy, which essentially rests on four pillars. Here is a brief recap:

    1. Offer the best products and solutions at every layer of the technology stack. That is, the best information storage products, the best servers, the best database, and so on, all based on industry standards, which favors interoperability with other vendors' products if that is the customer's choice.

    2. Integrate products from the design and development phase onward, to offer customers solutions that deliver extreme performance. The leading exponents of this strategy are the Engineered Systems, which include flagship products such as Exadata, Exalogic and Exalytics. Combining the best capabilities of Oracle products and integrating them vertically yields the following benefits:
       - Extreme performance of the technology solutions. For example, the latest version of Exadata, presented here, is up to 100 times faster than the previous one.
       - Cost reduction. Integrated systems reduce the space required, the installation and integration time, and maintenance costs. Their higher performance also translates into a lower infrastructure investment.
       - Simpler management and maintenance. Integrating different kinds of solutions into a single one also simplifies support contracts, centralizing them with a single vendor.

    3. A complete proposition for cloud environments. Oracle's offering includes a public cloud (Oracle Public Cloud), infrastructure for private clouds, and infrastructure for hybrid systems. The company thus offers Applications as a Service (AaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS). At the same time, it provides solutions so that companies can build their own cloud infrastructures. The advantage of Oracle's proposition is that the same technology is used for the public and the private cloud, so customers can easily choose which applications to keep in a public system, which in a private one, and so on. Maximum interoperability also makes it possible to move applications between deployment models without problems.

    4. Industry applications. Oracle is betting heavily on the development of applications for specific industry sectors, providing concrete solutions for Banking and Finance, Retail, Logistics, the Public Sector, Telecommunications and a long list of others.

    His opening remarks were followed by an interesting round of questions. It is not possible to summarize them all, but I will keep one message that Mark repeated a couple of times: Oracle wants to keep growing in all the markets in which it operates, and this is a strategy for growth. We all hope so.

    Read the article

  • Oracle Data Integration 12c: Simplified, Future-Ready, High-Performance Solutions

    - by Thanos Terentes Printzios
    In today's data-driven business environment, organizations need to cost-effectively manage the ever-growing streams of information originating both inside and outside the firewall, and address emerging deployment styles like cloud, big data analytics, and real-time replication. Oracle Data Integration delivers pervasive and continuous access to timely and trusted data across heterogeneous systems. Oracle is enhancing its data integration offering by announcing the general availability of the 12c release of its key data integration products: Oracle Data Integrator 12c and Oracle GoldenGate 12c, delivering simplified, high-performance solutions for cloud, big data analytics, and real-time replication. The new release delivers extreme performance, increases IT productivity, and simplifies deployment, while helping IT organizations keep pace with new data-oriented technology trends including cloud computing, big data analytics, and real-time business intelligence. With the 12c release, Oracle takes the lead in data integration and replication technologies: no other vendor offers such a complete set of data integration capabilities for pervasive, continuous access to trusted data across Oracle platforms as well as third-party systems and applications.

    The Oracle Data Integration 12c release addresses data-driven organizations' critical and evolving data integration requirements under three key themes:
    - Future-Ready Solutions: supporting current and emerging initiatives
    - Extreme Performance: even higher performance than ever before
    - Fast Time-to-Value: higher IT productivity and simplified solutions

    With the new capabilities in Oracle Data Integrator 12c, customers can benefit from:
    - Superior developer productivity, ease of use, and rapid time-to-market with the new flow-based mapping model, reusable mappings, and step-by-step debugger.
    - Increased performance when executing data integration processes, due to improved parallelism.
    - Improved productivity and monitoring via tighter integration with Oracle GoldenGate 12c and Oracle Enterprise Manager 12c.
    - Improved interoperability with Oracle Warehouse Builder, enabling faster and easier migration to Oracle Data Integrator, Oracle's strategic data integration offering.
    - Faster implementation of business analytics through Oracle Data Integrator pre-integrated with the latest release of Oracle BI Applications. Oracle Data Integrator also integrates simply and easily with Oracle business analytics tools, including OBI-EE and Oracle Hyperion.
    - Support for loading and transforming big and fast data, enabled by integration with big data technologies: Hadoop, Hive, HDFS, and Oracle Big Data Appliance.

    Only Oracle GoldenGate provides best-of-breed real-time replication of data in heterogeneous data environments. With the new capabilities in Oracle GoldenGate 12c, customers can benefit from:
    - Simplified setup and management when using multiple database delivery processes, via the new Coordinated Delivery feature for non-Oracle databases.
    - Expanded heterogeneity through added support for the latest versions of major databases, such as Sybase ASE 15.7, MySQL NDB Cluster 7.2 and MySQL 5.6, as well as integration with Oracle Coherence.
    - Enhanced high availability and data protection via integration with Oracle Data Guard and Fast-Start Failover.
    - Enhanced security for credentials and encryption keys using Oracle Wallet.
    - Real-time replication for databases hosted on public cloud environments, supported by third-party clouds.
    Tight integration between Oracle Data Integrator 12c, Oracle GoldenGate 12c and other Oracle technologies, such as Oracle Database 12c and Oracle Applications, provides a number of benefits for organizations:
    - Integration between Oracle Data Integrator 12c and Oracle GoldenGate 12c enables developers to leverage Oracle GoldenGate's low-overhead, real-time change data capture entirely within Oracle Data Integrator Studio, without additional training.
    - Integration with Oracle Database 12c provides a strong foundation for seamless private cloud deployments.
    - It delivers real-time data for reporting, zero-downtime migration, and improved performance and availability for Oracle Applications such as Oracle E-Business Suite and ATG Web Commerce.

    Oracle's data integration offering is optimized for Oracle Engineered Systems and is an integral part of Oracle's fast data, real-time analytics strategy on Oracle Exadata Database Machine and Oracle Exalytics In-Memory Machine. These many new features differentiate the new data integration offering in Oracle Data Integrator 12c and Oracle GoldenGate 12c. This is just a quick glimpse into the two products. Find out much more about the new release in the video webcast "Introducing 12c for Oracle Data Integration", where customer and partner speakers, including SolarWorld, BT and Rittman Mead, will join us in launching the new release.

    Resource Kits:
    - Meet Oracle Data Integration 12c
    - Discover what's new with Oracle GoldenGate 12c

    The Oracle EMEA DIS (Data Integration Solutions) Partner Community is available for all your questions, and additional partner-focused webcasts will be made available through our blog here, so stay connected. For any questions please contact us at partner.imc-AT-beehiveonline.oracle-DOT-com

    Stay Connected
    Oracle Newsletters

    Read the article

  • Build Your Own CE6 Kernel

    - by Kate Moss' Big Fan
    The Shared Source Program in Windows CE provides many modules in the %_WINCEROOT%\Private\ tree, and the kernel is one of them! Although it is not the full kernel source, it is good enough for tracing it, and even for tweaking the kernel. Tracing the kernel to see how it works is a lot of fun, but it is even more fascinating to modify it and verify the change you made. So, first things first: where is the kernel source? It's in %_WINCEROOT%\private\winceos\COREOS\nk\. And the next question will be "How do I build it?". Some of you may say just "build -c" there and it should be good. If you are the owner of the kernel and have the full source, that is definitely the right answer, but neither applies to our case. So what should I do? Let's dig deeper into the coreos\nk folder: there are a couple of subfolders, CELOG, KDSTUB, KERNEL and so on. KERNEL\ is the main component of kernel.dll; in other words, most modifications to the kernel are going to happen here. And the good thing is, you can "build -c" in %_WINCEROOT%\private\winceos\COREOS\nk\kernel\ with no errors at all. But before doing that, remember to back up everything you are going to modify, including the source and binaries; remember, this is not something that belongs to you, and if you don't restore it later, it could end up confusing subsequent QFE updates! Here are the steps:

    1. Back up the source code; I suggest the whole %_WINCEROOT%\private\winceos\COREOS\nk\.
    2. Back up the binaries in common\oak\lib\; again, if you are not sure which files, backing up the whole %_WINCEROOT%\common\oak\lib\ is the safest way.
    3. Make whatever modification you want in %_WINCEROOT%\private\winceos\COREOS\nk\kernel\.
    4. Run "build -c" in %_WINCEROOT%\private\winceos\COREOS\nk\kernel.

    If everything went well so far, you should get new nkmain.lib, nkmain.pdb, nkprmain.lib and nkprmain.pdb files in %_WINCEROOT%\public\common\oak\lib\%_TGTCPU%\%WINCEDEBUG%\. Basically, you have just rebuilt your kernel; the rest is to run "blddemo clean -q" to have your new kernel SYSGEN'd and included in your OS image. Or just "set WINCEREL=1", then "sysgen -p common nk nkprof" and "makeimg", if you can't wait the extra minutes for "blddemo clean -q".

    That sounds good, but some of you may not like the idea of altering any code in the private folder, not to mention how annoying it is to back up and restore files every time. A better idea? Yes, Microsoft provides a tool, SYSGEN_CAPTURE (see http://msdn.microsoft.com/en-us/library/ee504678.aspx for details and usage), which creates SOURCES files for public drivers that you want to modify and build in your platform directory. In fact, not only public drivers: virtually anything in the %_WINCEROOT%\public\<project name>\cesysgen\makefile can be captured, and that of course includes the kernel. So I am going to introduce a second way to build your own kernel, using the SYSGEN_CAPTURE tool. Again, the steps:

    1. Create a folder in your BSP for building the kernel, say %_TARGETPLATROOT%\SRC\Kernel.
    2. Run "SYSGEN_CAPTURE -p common nk" and you will get a SOURCES.KERN; you can also run "SYSGEN_CAPTURE -p common nkprof" to generate a profiler-enabled kernel.
    3. Rename SOURCES.KERN to SOURCES and copy one of the sample makefiles into your kernel directory, for example the one in PRIVATE\WINCEOS\COREOS\NK\KERNEL\NKNORMAL.
    4. Copy the source files you want to modify from private\winceos\coreos\nk\kernel\ into your kernel directory.
    5. Modify the SOURCES= macro to list the source files you added in step 4.
    For example, if you copied vm.c, it is going to be SOURCES=vm.c.
    6. Refer to private\winceos\COREOS\nk\kernel\sources.inc and add the macro definitions and proper include paths to your SOURCES file.
    7. Run "set WINCEREL=1" and "build -c" in your kernel directory, then "makeimg". Voila!

    Here is an example of the macros you need to add for x86:

        CDEFINES=$(CDEFINES) -DIN_KERNEL -DWINCEMACRO -DKERN_CORE
        # Machine independent defines
        CDEFINES=$(CDEFINES) -DDBGSUPPORT
        _COREOSROOT=$(_WINCEROOT)\private\winceos\coreos
        INCLUDES=$(_COREOSROOT)\inc;$(_COREOSROOT)\nk\inc
        !IFDEF DP_SETTINGS
        CDEFINES=$(CDEFINES) -DDP_SETTINGS=$(DP_SETTINGS)
        !ENDIF
        ASM_SAFESEH=1
        CDEFINES=$(CDEFINES) -Gs100000 -DENCODE_GS_COOKIE

    Read the article

  • E-Business Suite Plug-in 12.1.0.1 for Enterprise Manager 12c Now Available

    - by Steven Chan (Oracle Development)
    Oracle E-Business Suite Plug-in 12.1.0.1.0 is now available for use with Oracle Enterprise Manager 12c. Oracle E-Business Suite Plug-in 12.1.0.1 is an integral part of the Oracle Enterprise Manager 12c Application Management Suite for Oracle E-Business Suite. This latest plug-in extends EM 12c Cloud Control with E-Business Suite specific system management capabilities and features enhanced change management support.

    The Oracle Enterprise Manager 12c Application Management Suite for Oracle E-Business Suite includes:
    - Oracle E-Business Suite Plug-in 12.1.0.1, which combines functionality that was available in the previously-standalone Application Management Pack for Oracle E-Business Suite and Application Change Management Pack for Oracle E-Business Suite
    - Oracle Real User Experience Insight
    - Oracle Configuration & Compliance capabilities

    Features that were previously available in the standalone management packs are now packaged in the Oracle E-Business Suite Plug-in, which is certified with Oracle Enterprise Manager 12c Cloud Control:
    - Functionality previously available in Application Management Pack (AMP) is now classified as "System Management for Oracle E-Business Suite" within the plug-in.
    - Functionality previously available in Application Change Management Pack (ACMP) is now classified as "Change Management for Oracle E-Business Suite" within the plug-in.
    - The Application Configuration Console and the Configuration Change Console are now native components of Oracle Enterprise Manager 12c.

    System Management Enhancements

    General
    - Oracle Enterprise Manager 12c Base Platform uptake: All components of the management suite are certified with Oracle Enterprise Manager 12c Cloud Control.

    Security
    - Privilege Delegation: The Oracle E-Business Suite Plug-in now extends Enterprise Manager's privilege delegation through Sudo and PowerBroker to Oracle E-Business Suite Plug-in host targets.
    - Privileges and Roles for Managing Oracle E-Business Suite: This release includes new ready-to-use target and resource privileges to monitor, manage, and perform Change Management functionality.

    Cloning
    - Named Credentials uptake in cloning: The Clone module transactions now let users leverage the Named Credentials feature introduced in Enterprise Manager 12c, thereby passing all the benefits of Named Credentials in Enterprise Manager on to Oracle E-Business Suite Plug-in users.
    - Smart Clone improvements: In addition to the existing 11i support available in previous releases, the new Oracle E-Business Suite Plug-in widens the coverage to Oracle E-Business Suite Releases 12.0.x and 12.1.x. The new and improved Smart Clone UI supports adding "pre and post" custom steps to a copy of the ready-to-use cloning deployment procedure. A user can now pass parameters to the custom steps through the interview screen of the UI, as well as pass ready-to-use parameters to the custom steps. Additional configuration enhancements are included for configuring RAC target databases, such as the ability to customize listener names and the option to configure with Virtual IP or SCAN IP.

    Change Management Enhancements

    Customization Manager
    - Support for longer file names: Customization Manager now handles file names up to thirty characters in length.

    Patch Manager
    - Queuing of Patch Manager runs: This feature allows patch runs to queue up if Patch Manager detects that a specific target is in a blackout state.
    - Multi-node system patching: The patch run interview has been enhanced to allow the Enterprise Manager Administrator to choose which nodes adpatch will run on.
    - New AD Administration options: The patch run interview has been extended to include the AD Administration options "Relink Application Programs", "Generate Product Jars Files", "Generate Report Files", and "Generate Form Files".

    Downloads

    Fresh install (for new customers or existing customers wishing to perform a fresh install):
    - Enterprise Manager Store (within Enterprise Manager 12c)
    - Oracle Software Delivery Cloud

    Upgrades (for existing customers wishing to upgrade their AMP 4.0 or AMP 3.1 installations):
    - Oracle Technology Network
    - Getting Started with Oracle E-Business Suite Plug-In, Release 12.1.0.1 (Note 1434392.1)

    Prerequisites
    - Enterprise Manager Cloud Control 12c
    - One or more of the following Oracle E-Business Suite releases:
      - Release 11.5.10 CU2 with 11i.ATG_PF.H.RUP6 or higher
      - Release 12.0.4 with R12.ATG_PF.A.delta.6
      - Release 12.1 with R12.ATG_PF.B.delta.3

    Platforms and OS release certification information is available from My Oracle Support via the Certification page. Search for "Oracle Application Management Pack for Oracle E-Business Suite and release 12.1.0.1.0."

    Related Articles
    - Oracle E-Business Suite Plug-in 4.0 Released for OEM 11g (11.1.0.1)

    Read the article

  • Join us on our Journey to be #1 in SaaS!

    - by jessica.ebbelaar(at)oracle.com
    WHY ORACLE? Oracle is a robust organization that has proven it can maintain growth and innovation at all levels, with a constantly evolving attitude. The main ingredient of Oracle's success is its 105,000 talented employees, who constantly amaze each other in building a better and more innovative organization. Oracle is a company where YOU can make a difference.

    What is OD? Oracle Direct is a state-of-the-art, multi-channel EMEA sales operation bringing to life the benefits of Oracle's complete technology stack. It offers you the unique opportunity to work with the most talented and like-minded sales professionals in the industry. You will have access to world-class training and structured career development programmes, allowing you to accelerate your Solution Sales career across a multitude of product lines and a choice of attractive locations.

    What positions is OD hiring for? Oracle is on a journey to be the #1 SaaS vendor in EMEA. Due to recent expansion and acquisitions within our Cloud Business, we are now growing our EMEA Cloud Applications Sales Group in Dublin. We have many exciting NEW opportunities across our CRM and HCM SaaS Sales teams. As a SaaS Sales Account Manager, you will proactively manage an assigned territory / vertical with responsibility for the full sales cycle. This role requires strong business development, solution selling, account management and closing skills.

    What is the Business Development Group (BDG)? The Business Development Group is the key entry point in Oracle for the future Sales and Management talent of the organisation. We are the Demand Generation engine for Oracle in EMEA. We provide revenue-generating, quality sales pipeline to our Inside and Field Sales professionals as well as to our Channel Partners. Our current focus is to provide an agile and flexible service offering to our customers and stakeholders to meet ever-changing business needs, whilst constantly striving to improve the customer experience, the quality of our pipeline, market coverage and penetration. As a SaaS Business Development Consultant (BDC) you will be the first touch point with new customers.
    Your goal is to proactively identify and qualify business opportunities leading to revenue for Oracle. You will work closely with your Inside Sales colleagues, who will progress your qualified pipeline and opportunities.

    Work for us:
    - Work for the only multi-pillar SaaS vendor in the market
    - Be part of a FUN, fast-paced and truly international sales team
    - Develop your solution sales EXPERTISE
    - Drive your CAREER development within a structured and supportive environment

    The Profile:
    - You have a passion for selling cutting-edge technology
    - You thrive in a fast-paced and dynamic work environment where being the best is paramount
    - Your priority is always the customer
    - You live for a challenge and you love to win

    Join us on our Journey to be #1 in SaaS and be part of our Cloud Success Story! You will find more information about open roles here.

    Read the article

  • The Healthy Tension That Mobility Creates

    - by Kathryn Perry
    A guest post by Hernan Capdevila, Vice President, Oracle Fusion Apps

    In my previous post, I talked about the value of the mobile revolution for businesses and workers. Now let me put on a different hat and view the world from the IT department's and the IT leader's viewpoint. The IT leader has different concerns: privacy, the potential liability of information leakage, and intellectual property protection. These concerns and the leader's goals create a healthy tension with the users. For example, effective device management becomes a must-have for the IT leader, especially if you look at the Android ecosystem as an example. There are benefits to the Android strategy, but there are also drawbacks around uniformity: in device management, in operating systems, and in the application taxonomy and capabilities. If you compare Android to iOS, Apple's operating system, iOS is more unified, more streamlined, and easier to manage. In either case, this is where mobile device management in the cloud makes good sense. I don't think IT departments should be hosting device management and managing that complexity. It should be a cloud service, and I predict it's going to be key for our customers.

    A New Focus for IT Departments

    So where does that leave the IT departments? I think their future is in governance, which is a more strategic play than a tactical one. Device management is tactical, and it's the "now" topic. But the mobile phenomenon, if you will, is going to drive significant change in terms of how IT plans, hosts, and deploys enterprise applications. For example, opening up enterprise applications for mobile users presents some challenges unless you deploy more complicated network topologies, such as virtual private networks and threat protection technology. If you really want employees to be mobile, you need to remove those kinds of barriers. But I don't think IT departments want to wrestle with exposing their private enterprise data centers and being responsible for hosted business applications, applications that they are, in a sense, making vulnerable to the public world. This opens up a significant need and a significant driver for cloud applications. However, it's not just about taking away the complexity; it's also about taking away the responsibility. Why should every business have to carry the responsibility and figure out all the nuts and bolts of how to protect themselves in this public, mobile world? When you use apps in the cloud, either your vendor or your hosting partner should have figured all that out. They need to assure the business that they are adhering to all sorts of security and compliance regulations, so users can be connected and have access to information anywhere, anytime.

    More Ideas and Better Service

    What's more interesting is the world of possibilities that the connected, cloud-based world enables. I believe that the one-size-fits-all, uber-best-practices, lowest-common-denominator-like capabilities will go away. IT will now be able to solve very specific business challenges for the different corporate functions it serves. In this new world, IT will play a key role in enabling different organizations within a company to be best in class and deliver greater value to the line-of-business managers. IT will actually help to differentiate. The net result is a more agile workforce and business, because each department gets work done its own way.

    Read the article

  • OPN Exchange General Sessions –Fowler, Kurian & More!

    - by Kristin Rose
    With so much excitement about to take place at OPN Exchange @ OpenWorld, it's hard to decide what to attend, where to go, who to meet and what to eat! Let us help you decide by first asking a question: how often do you get to choose between seven key Oracle executives as they address the five biggest topics facing the industry today? After the Partner Keynote with Judson Althoff, join us for the OPN Exchange General Sessions:

    DATE: Sunday, September 30th
    TIME: 3:30-4:30 pm
    LOCATION: Moscone South, Esplanade Level

    - John Fowler & Tom LaRocca (Technology for Partners: Room 306): Learn how to grow your top and bottom line by selling Oracle on Oracle.
    - Chris Leone (Applications for Partners: Room 303): Catch the partner-only value prop, selling secrets and competitive compares to win with the Fusion Applications product family.
    - Angelo Pruscino & Sohan DeMel (Engineered Systems for Partners: Room 301): Get the secrets to selling and implementing Oracle's transformational Engineered Systems products. You won't want to miss the Oracle Database Appliance Unplugged demonstration!
    - Sonny Singh (Industry Solutions: Room 302): Develop profitable practices answering the challenges faced by companies operating in discrete industries and the opportunity represented by Machine2Machine Java.
    - Thomas Kurian (Cloud for Partners: Room 304): Today it is all about the Cloud. Oracle offers traditional cloud infrastructure solutions as well as cloud platform and software services. Attend this session to learn more about Oracle's Platform, Application, and Social cloud services.

    Put on your thinking caps, because these speakers are ready to blow your mind with five tracks of exclusive content catered to you, our partners. Boom!

    The OPN Communication Team

    Read the article

  • Move over DFS and Robocopy, here is SyncToy!

    - by andywe
    Ever since Windows 2000, I have always had the need to replicate data to multiple endpoints with the same content. Until DFS was introduced, the usual approach was either to copy the data manually, location by location, or to batch-script it with xcopy and schedule a task. Even though this worked (and still does today), it was cumbersome and network-intensive, especially when dealing with larger amounts of data. Then along came robocopy, an internal tool written by an enterprising programmer at Microsoft. We used it quite a bit, especially when we could not use DFS in the early days. It was received so well that it made it into the public realm. At least now we had the ability to determine which files had changed and replicate only those.

    Well, over time there has been an evolution of this idea. DFS is obviously the Windows enterprise-class service for this, along with BranchCache. However, you don't always need or want the power of DFS, especially when it comes to small datacenter installations or remote offices. I have specific data sets on closed or restricted networks that either have a security need for this, or are in remote countries where bandwidth is at a premium. For this, I use the latest evolution for one-off replication, named SyncToy. SyncToy is a Microsoft tool, seemingly released in 2009, that wraps a nice GUI around setting up a paired set of folders (remember the mobile briefcase from Windows 98?) and gives you a choice of synchronization methods: one-way or two-way. Simply create a paired set of folders on the source and destination, choose your options for content, exclude any file types you don't want to replicate, and click Run. Scheduling is even easier: MS has included a wrapper for doing just this, so all you enter in your scheduled task is SyncToyCmd.exe, a -R as an argument, and the time schedule. No more complicated command lines or scripts.

    I find this especially useful when I use MS Backup to back up a system volume but only want subsets of backup information from a data share, and ONLY when that dataset has changed, rather than relying on full and incremental backups. An example of this is my application installation master share. I back this up with SyncToy because I do not need multiple backup copies; one copy elsewhere suffices. At home, it is very useful for your pictures, videos, music, etc.: the backup is online and ready to access, with no waiting to restore a backup file, and no need to set up a domain simply to have DFS.

    Do note there is a risk: if you accidentally delete a file and do not catch this before the next sync, then depending on your SyncToy settings you can indeed lose that file as the destination updates, so due diligence applies. I make it a rule to sync mainly one way: I use my master share for making changes and allow the schedule to follow suit. Any really important file I lock down as read-only through file permissions, so it cannot be deleted unless I intervene.

    Check out the tool and have some fun! http://www.microsoft.com/en-us/download/details.aspx?DisplayLang=en&id=15155

    Read the article

  • Curl Error 52 Empty reply from server

    - by Paul Sheldrake
    Hello, I have a cron job set up on one server to run a backup script in PHP that is hosted on another server. The command I've been using is formatted like this:

        curl -sS http://www.example.com/backup.php

    Lately I've been getting this error when the cron runs:

        curl: (52) Empty reply from server

    I have no idea what this means. If I go to the link directly in my browser, the script runs fine and I get my little backup zip file. Can anyone help? Thanks, Paul
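    For context, curl's error 52 (CURLE_GOT_NOTHING) means the server closed the connection without sending any response, which for a long-running backup script often points to a server-side timeout that a quick browser test does not hit. A minimal debugging sketch, assuming only standard curl options and using the placeholder URL from the question:

        # Show request/response details, allow the script a long run time, and
        # re-attempt transient failures.
        curl -v --max-time 3600 --retry 3 http://www.example.com/backup.php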

    Read the article

  • Good Secure Backups Developers at Home

    - by slashmais
    What is a good, secure method of doing backups for programmers who do research & development at home and cannot afford to lose any work? Conditions:

    1. The backups must ALWAYS be within reasonably easy reach.
    2. An internet connection cannot be guaranteed to be always available.
    3. The solution must be either FREE or priced within reason, subject to 2 above.

    Status Report: This is for now only considering free options. The following open-source projects are suggested in the answers (here & elsewhere); a short Duplicity sketch follows below the list:

    - BackupPC is a high-performance, enterprise-grade system for backing up Linux, WinXX and MacOSX PCs and laptops to a server's disk.
    - Storebackup is a backup utility that stores files on other disks.
    - mybackware: These scripts were developed to create SQL dump files for basic disaster recovery of small MySQL installations.
    - Bacula is [...] to manage backup, recovery, and verification of computer data across a network of computers of different kinds. In technical terms, it is a network-based backup program.
    - AutoDL 2 and Sec-Bk: AutoDL 2 is a scalable, transport-independent automated file transfer system. It is suitable for uploading files from a staging server to every server on a production server farm [...] Sec-Bk is a set of simple utilities to securely back up files to a remote location, even a public storage location.
    - rsnapshot is a filesystem snapshot utility for making backups of local and remote systems.
    - rbme: Using rsync for backups [...] you get perpetual incremental backups that appear as full backups (for each day) and thus allow easy restore or further copying to tape etc.
    - Duplicity backs up directories by producing encrypted tar-format volumes and uploading them to a remote or local file server. [...] uses librsync [for] incremental archives.

    Other possibilities:
    - Using a Distributed Version Control System (DVCS) such as Git (/Easy Git), Bazaar or Mercurial answers the need to have the backup available locally.
    - Use free online storage space as a remote backup, e.g.: compress your work/backup directory and mail it to your gmail account.

    Strategies: see crazyscot's answer.
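    A minimal Duplicity sketch of the encrypted, incremental option above, assuming a Unix-like machine with duplicity installed; the host, user and paths are placeholders:

        # Encrypted, incremental backup of a work directory over SFTP.
        export PASSPHRASE='your-gpg-passphrase'   # used by duplicity for encryption
        duplicity /home/dev/projects sftp://backupuser@backuphost//srv/backups/projects

        # Restore the latest snapshot into a scratch directory:
        duplicity restore sftp://backupuser@backuphost//srv/backups/projects /tmp/restore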

    Read the article

  • .bat file using WinRAR taking too long to run

    - by Jessie
    Hi guys, I have this script which extracts all my folders and files from my c:\projects location, puts them into WinRAR and transfers them to c:\backup\projects:

        for /f "delims==" %%D in ('DIR C:\projects /A /B /S') do (
          "C:\Program Files\WinRAR\WinRAR.EXE" m -r "c:\backup\projects.rar" "%%D"
        )

    I have also tried the script below, which uses the same source c:\projects but puts each folder into its own separate WinRAR archive, mirroring the source, then transfers the archives into my c:\backup:

        FOR /F "DELIMS==" %%D in ('DIR C:\projects /AD /B') DO (
          "C:\Program Files\WinRAR\WinRAR.EXE" m -r "C:\Backup\%%D.rar" "%%D"
        )

    My question is: my second script only takes two hours to run, while my first script takes over 24 hours. Is there any way to make my first script faster? If anything, shouldn't my first script be faster?

    Read the article

  • SQL Server 2008 and MySQL Daily Backups

    - by Tyler
    Is there a quick and easy way to back up both SQL Server 2008 and MySQL, all their databases? Right now I have a batch script that runs, but I have to manually add a database each and every time, and I'm sick of maintaining it. So I want to set it up to back up all of SQL Server and then all of MySQL. I don't care if it's two different solutions; I just want the ability to back up all the databases without having to type them in. Thank you.
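    A hedged sketch of the enumerate-everything approach, assuming a Unix-like shell with mysqldump and sqlcmd on the PATH; the credentials, server names and paths are placeholders, and the .bak path must be writable by the SQL Server service:

        #!/bin/bash
        # Back up every database on both servers without listing them by hand.
        BACKUP_DIR=/backups/$(date +%F)
        mkdir -p "$BACKUP_DIR"

        # MySQL: --all-databases picks up every schema automatically.
        mysqldump -u backup -p'secret' --all-databases > "$BACKUP_DIR/mysql-all.sql"

        # SQL Server: enumerate user databases (ids 1-4 are the system
        # databases), then run BACKUP DATABASE for each one.
        sqlcmd -S localhost -U backup -P 'secret' -h -1 -W \
          -Q "SET NOCOUNT ON; SELECT name FROM sys.databases WHERE database_id > 4" |
        while read -r db; do
          sqlcmd -S localhost -U backup -P 'secret' \
            -Q "BACKUP DATABASE [$db] TO DISK = N'$BACKUP_DIR/$db.bak' WITH INIT"
        done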

    Read the article

  • What are the Pros & Cons of using SQL Azure for existing apps on dedicated servers

    - by Mark Redman
    We currently own our own servers and rent a rack in a datacentre. Looking at the pricing, scalability and SLAs for SQL Azure, I am thinking it might be viable to use SQL Azure for the database only, while continuing to run our existing applications on our own servers in the datacentre. This would let us stop worrying about the database and its infrastructure, so we can concentrate on building an application server farm with disk storage for files etc. Our application is quite big, has various Windows services, and parts of it use unmanaged libraries that may not be feasible in the cloud, so we probably couldn't have everything in the Azure cloud.

    The pros: reduced total cost of ownership (no database servers, no SQL Server licenses).
    The cons: I guess there would be overhead in the transfer of data between the Azure cloud and our datacentre (i.e. the cloud may be in the US while the datacentre is in the UK), but would this overhead be acceptable?

    Read the article

  • SQL Server 2008 - Shrinking the Transaction Log - Any way to automate?

    - by Albert
    I went in and checked my transaction log the other day and it was something crazy like 15GB. I ran the following code:

        USE mydb
        GO
        BACKUP LOG mydb WITH TRUNCATE_ONLY
        GO
        DBCC SHRINKFILE(mydb_log, 8)
        GO

    That worked fine and shrank it down to 8MB, but the DB in question is a log shipping publisher, and the log is already back up to some 500MB and growing quickly. Is there any way to automate this log shrinking, outside of creating a custom "Execute T-SQL Statement Task" maintenance plan task and hooking it onto my log backup task? If that's the best way then fine, but I was just thinking that SQL Server would have a better way of dealing with this. I thought it was supposed to shrink automatically whenever you took a log backup, but that's not happening (perhaps because of my log shipping, I don't know).

    Here's my current backup plan:
    - Full backups every night
    - Transaction log backups once a day, late morning (maybe hook the log shrinking onto this; it doesn't need to be shrunk every day though)

    Or maybe I just run it once a week, after I run a full backup task? What do you all think?
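    For reference, a hedged sketch of the "hook it onto the log backup" idea from the question: the same statement can be run by any scheduler through sqlcmd (the instance name, credentials and logical file name below are placeholders). Note that truncating the log on a log shipping publisher breaks the log backup chain, so a shrink alone, run after the regular log backup, is the safer variant:

        # Run from Task Scheduler or a SQL Agent CmdExec step after the log backup:
        sqlcmd -S myserver -U backup_admin -P 'secret' \
          -Q "USE mydb; DBCC SHRINKFILE(mydb_log, 8);"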

    Read the article

  • Grails Deployment - Fastest way to get deployed?

    - by gav
    Hi all, if anyone has or is running a Grails application on their server, I would appreciate some details on where to go after creating the WAR.

    Background: I chose Grails because, with Google App Engine and the App Engine plugin, deployment should have been trivial. The issue is that there is a bug which makes any application pretty much unusable; I wish this had been more prominent, so I didn't have to get to the point of seeing the error myself before I was aware of it. The next option was EC2 and the Cloud Tools plugin. It seems Cloud Tools worked with Grails 1.0 but doesn't work with the current 1.2.1, due to issues getting the JAR dependencies. It also seems that Cloud Tools has been succeeded by Cloud Foundry, which is in beta, will cost extra money and has limited places (I signed up but haven't got an e-mail).

    Question: My application is painfully trivial; it has a small load, small data requirements and doesn't need to scale past 5 users. How can I deploy my Grails app as quickly and painlessly as possible? Specifically:

    1. Are there any hosting companies that have Tomcat installed on their servers out of the box, that I can sign up with and that will just work?
    2. Do you know of any simple tutorials for getting a Grails application deployed to EC2 without Cloud Tools?

    Thanks in advance, Gav

    Side-note: I picked Grails because of good advice from SO; it should have been a very short time from development to deployed product, except the tools for auto-deployment aren't that mature and I've never configured a server before.

    Read the article

  • In terminal, merging multiple folders into one.

    - by Josh Pinter
    I have a backup directory created by WDBackup (the Western Digital external HD backup utility) that contains a directory for each day that it backed up, holding the incremental contents of just what was backed up that day. So the hierarchy looks like this:

        20100101
            My Documents
                Letter1.doc
            My Music
                Best Songs Every
                    First Songs.mp3
                My song.mp3    # modified 20100101
        20100102
            My Documents
                Important Docs
                    Taxes.doc
            My Music
                My Song.mp3    # modified 20100102
        ...etc...

    Only what has changed is backed up, and the first backup that was ever made contains all the files selected for backup. What I'm trying to do now is incrementally copy, while keeping the folder structure, from oldest to newest, each of these dated folders into a 'merged' folder, so that newer content overrides the older content. As an example, using just these two example folders, the final merged folder would look like this:

        Merged
            My Documents
                Important Docs
                    Taxes.doc
                Letter1.doc
            My Music
                Best Songs Every
                    First Songs.mp3
                My Song.mp3    # modified 20100102

    Hope that makes sense. Thanks, Josh
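    A hedged sketch of the merge, assuming the YYYYMMDD layout above (so lexical sort order is also chronological) and a Unix-like shell; run it from the backup root:

        #!/bin/bash
        # Sync each dated folder into Merged/, oldest first, so newer copies of
        # a file overwrite older ones while the folder structure is preserved.
        # The dated folder names contain no spaces, so the loop is safe here.
        mkdir -p Merged
        for d in $(ls -d [0-9]*/ | sort); do
          rsync -a "$d" Merged/
        done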

    Read the article

  • Sending STDERR to logger

    - by Gnutt
    I'm writing a bash script to perform an offsite backup, using rsync over SSH. I'm able to send STDOUT to logger, for logging, via:

        rsync --del -az -e 'ssh -i mycrt.crt' /home/gnutt/backup/ me@offisite:backup | logger -i

    But I want to send STDERR instead, so that if there is a problem, such as the offsite host being unavailable, that output is sent to logger and logged.
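    A sketch using only standard shell redirection, with the same command as above: redirections are processed left to right, so 2>&1 first points STDERR at the pipe, then >/dev/null discards the original STDOUT.

        # Log only STDERR:
        rsync --del -az -e 'ssh -i mycrt.crt' /home/gnutt/backup/ me@offisite:backup \
          2>&1 >/dev/null | logger -i

        # Bash alternative that leaves STDOUT untouched, via process substitution:
        rsync --del -az -e 'ssh -i mycrt.crt' /home/gnutt/backup/ me@offisite:backup \
          2> >(logger -i)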

    Read the article

  • Should I be regularly shrinking my DB or at least my log file?

    - by Tom
    My question is: should I be running one or both of the shrink commands regularly, DBCC SHRINKDATABASE or DBCC SHRINKFILE?

    Background: SQL Server; the database is 200 gigs, the logs are 150 gigs. Running this command:

        SELECT name,
               size/128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int) / 128.0 AS AvailableSpaceInMB
        FROM sys.database_files;

    produces this output:

        MyDB:     159.812500 MB free
        MyDB_Log: 149476.390625 MB free

    So it seems there is some free space. We back up transaction logs every hour, do a diff backup 5 nights a week, and a full backup the other 2 nights of the week.

    Read the article

  • Should I go along with my choice of web hosting company or still search?

    - by Devner
    Hi all, I have been searching for a good website hosting company that can offer me all the services I need for hosting my PHP & MySQL based website. This is a community-based website, and users will be able to upload pictures, etc. The hosting company I have in mind currently lets me do everything: use mail(), run CRON jobs, etc. They charge about $6/month. The only problem with this company is that they have a limit of 50,000 files that can exist within the hosting account at any time, which rather contradicts the "UNLIMITED SPACE" ad on their front page. Apart from this, I know of no other reason why I should not go with this hosting company. But a 50,000-file limit is something I cannot live with once the user base grows significantly and the files they upload exceed 50,000 in number.

    Now, since this is a dynamic website that also includes sensitive operations like payments, I am not sure if I should go ahead with this company while I am just starting out and then later switch over to a better hosting company that does not limit me to 50,000 files. If I do need to switch over after hosting with this company, I will have to take backups of all the files located in my account (jpg, zip, etc.) and then upload them to the new host. I am not aware of any tools that can help in this process. Can you please mention any that you know of?

    I could go with the other companies right now, but their cost is double or triple the current price, and they all offer fewer features than my current choice. If I pay more, they are ready to accommodate my higher demands. Unfortunately, the company I am willing to go with now does NOT have any higher or better plans that I can switch to. So that's the really, really bad part.

    So my question(s):

    1. Since I am starting out with my website, and since the initial user base will be small, should I go ahead with the current choice and then switch over to a better provider once demand increases? If yes, how can I transfer my data, especially the jpg files, etc., to the new provider? I don't even know the tools required to back up from one host and restore to another.
    2. (I don't like this idea, but still:) Should I pay more right now and go with a better provider, without knowing whether the website will really do that well, just to save myself the trouble of backing up 50,000 files and uploading them from the old host to a new one, and start paying double or triple the price without even knowing whether I will get the returns I expect?

    Backup and restore in such bulky numbers is something I have never done before, and hence I am stuck trying to decide what to do. The price per month is also a considerable factor in my decision. All these web hosting companies say one thing in common: it is the customer's responsibility to back up and restore data, and they are not liable for any loss. So no matter which hosting company I go with, they ask me to take backups via FTP so that I can restore them whenever I want (and it seems safer to have the files locally with me anyway). Some provide tools for backup and some do not, and I am not sure how much their backup tools can be trusted, considering the disclaimers they have.

    I have never backed up and restored 50,000 files from one web host to another, so please, all you experienced people out there, leave your comments and let me know your suggestions so that I can decide. I have spent 2 days fighting with myself trying to decide what to do, and finally concluded that this is a double-edged sword that I can't resolve without others' suggestions. I believe someone out there must have had to make such a troublesome decision. All your suggestions to help me decide are appreciated. Thank you all.
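    On the tooling question, a hedged sketch, assuming both hosts expose plain FTP and lftp is available locally; the host names, credentials and paths are placeholders:

        # Mirror the whole account tree from the old host into a local folder,
        # then push that folder up to the new host (--reverse uploads).
        lftp -u olduser,oldpass -e 'mirror --verbose /public_html ./site-backup; quit' ftp.oldhost.com
        lftp -u newuser,newpass -e 'mirror --reverse --verbose ./site-backup /public_html; quit' ftp.newhost.com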

    Read the article

  • Query about the service or technology behind gmail service

    - by user1726908
    I am a final-year computer science student, studying in Hyderabad, Andhra Pradesh, India. I have come to know that Gmail is a cloud service. I am very much interested in learning more about cloud computing; this technology has been puzzling me, tickling and increasing my curiosity, and I just want to learn as much about it as I can. Through experience, I have learnt that practical work improves both our knowledge and our thirst to learn more. Thus, I would like to know: what security measures are taken to keep a cloud service like Gmail secure and authentic? What is the architecture of the service? What technologies are used in building it? And what different levels of security are applied, in general, when building a private cloud?

    Read the article

  • Bash script to find a directory, list its contents and sub-folder info

    - by lithiumion
    Hi, I want to write a script that will:

    1. Locate the folder "store" on a *nix filesystem
    2. Move into that folder
    3. Print a list of its contents with last modification dates
    4. Calculate the sub-folders' sizes

    This folder's absolute path changes from server to server, but the folder name always remains the same. There is a config file that contains the correct path to the folder, though it doesn't give the absolute path to it. Sample config:

        Account ON
        DIR-Store /hdd1
        Scheduled YES

    According to the config file, the absolute path would be "/hdd1/backup/store/". I need the script to grep the "/hdd1" (or whatever follows the word "DIR-Store"), add "/backup/store/" to it, move into the folder "store", print a list of its contents, and calculate the sub-folders' sizes. Until now I have manually edited the script on each server to reflect the path to the "store" folder. Here is a sample script:

        #!/bin/bash
        echo " "
        echo " "
        echo "Moving Into Directory"
        cd /hdd1/backup/store/
        echo "Listing Directory Content"
        echo " "
        ls -alh
        echo "*******************************"
        sleep 2
        echo " "
        echo "Calculating Backup Size"
        echo " "
        du -sh ./*    # sizes of each sub-folder in the current (store) directory
        echo "********** Done! **********"

    I know I could use grep:

        cat /etc/store.conf | grep DIR-Store

    I just don't know how to get around selecting the path, adding the "/backup/store/", and moving ahead. Any help will be appreciated.
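    A minimal sketch of the missing step, assuming the config format shown above and that the file lives at /etc/store.conf (that path is a placeholder):

        #!/bin/bash
        # Take the second field of the DIR-Store line ("/hdd1") and append the
        # fixed suffix, then use the result instead of a hard-coded path.
        BASE=$(awk '/^DIR-Store/ {print $2}' /etc/store.conf)
        STORE="$BASE/backup/store"

        cd "$STORE" || exit 1   # bail out if the folder is missing
        ls -alh                 # contents with last modification dates
        du -sh ./*              # per-sub-folder sizes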

    Read the article

  • Determine if an app is running in Azure or not

    - by longday
    I have an ASP.NET MVC app that is built to run either as a standard web app in IIS or in the cloud. I need to be able to determine whether the app is being hosted in Azure (dev fabric or cloud) or run as a standard web app under IIS. How can I tell if it is running in the cloud?

    Read the article

  • Rsync module path needs to be a home directory

    - by Malfist
    I'm trying to use rsync to back up Windows servers to an rsync server. I'm having problems with rsync on the Linux side, though: it doesn't like symlinks. Currently I'm trying to use the module path ~/backup, but rsync says that the chroot failed. I looked up what to do and saw that I needed to add the options "use chroot = no" and "munge symlinks = no". That fixed the "@ERROR: chroot failed", but now it's telling me "@ERROR: chdir failed", and the log files say that there is no ~/backup directory. I know the user I'm authenticating with has a backup folder in his home directory. How can I fix this? For reference, I'm using a .NET port of rsync called NetSync and tunneling it over a port-forwarded SSH connection created with Granados.
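    A hedged sketch of the likely fix: the rsync daemon reads module paths literally rather than through a login shell, so "~" is typically not expanded; spelling out the absolute home directory in rsyncd.conf (the user name and path below are placeholders) avoids the chdir failure:

        # Append a module definition with an absolute path (run as root):
        cat >> /etc/rsyncd.conf <<'EOF'
        [backup]
            path = /home/backupuser/backup
            use chroot = no
            munge symlinks = no
            read only = no
        EOF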

    Read the article
