Search Results

Search found 5634 results on 226 pages for 'conflicting libraries'.

Page 120/226

  • How to Eliminate Tape Backup and Off-site Storage Service?

    - by Daniel Lucas
    PLEASE READ THE UPDATE AT THE BOTTOM. THANKS! ;)

    Environment Info (all Windows): 2 sites; 30 servers at site #1 (3TB of backup data); 5 servers at site #2 (1TB of backup data); an MPLS backbone tunnel connecting site #1 and site #2.

    Current Backup Process: Online Backup (disk-to-disk): Site #1 has a server running Symantec Backup Exec 12.5 with four 1TB USB 2.0 disks. BE jobs for full backups run nightly on all servers in site #1 to these disks. Site #2 backs up to a central file server there using software they already had when we purchased them. A BE job pulls that data nightly to site #1 and stores it on said disks. Off-site Backup (tape): Connected to our backup server is a tape drive. BE backs up the external disks to tape once a week, which gets picked up by our off-site storage company. Obviously we rotate two tape libraries; one is always here and one is always there.

    Requirements: Eliminate the need for tape and the off-site storage service by doing disk-to-disk at each site and replicating site #1 to site #2 and vice versa. A software-based solution, as hardware options have been too pricey (e.g., SonicWall, Arkeia). Agents for Exchange, SharePoint, and SQL.

    Some Ideas So Far: Storage: a DroboPro at each site with an initial 8TB of storage (these are expandable up to 16TB at present). I like these because they are rackmountable, allow disparate drives, and have iSCSI interfaces. They are relatively cheap too. Software: Symantec Backup Exec 12.5 already has all the agents and licenses we need. I'd like to keep using it unless there is a better solution, similarly priced, that does everything BE does plus deduplication and replication. Server: Because there is no more need for a SCSI adapter (for the tape drive), we are going to virtualize our backup server, as it is currently the only physical machine save for the SQL boxes.

    Problems: When replicating between sites we want as little data as possible to go across the pipe. There is no deduplication or compression in what I have laid out here so far. The files being replicated are BE's virtual tape libraries from our disk-to-disk backup. Because of this, each of those huge files will go across the wire every week, because they change every day.

    And Finally, the Question: Is there any software out there that does deduplication, or at least compression, to handle just our site-to-site replication? Or, looking at our setup, is there any other solution that I am missing that might be cheaper, faster, better? Thanks. Sorry so long.

    UPDATE 1: I've set a bounty on this question to get it more attention. I'm looking for software that will handle replication of data between two sites using the least amount of data possible (either compression, deduplication, or some other method). Something similar to rsync would work, but it needs to be native to Windows and not a port involving shenanigans to get up and running. I prefer a GUI-based product and I don't mind shelling out a few bones if it works. Please, answers that meet the above criteria only. If you don't think one exists, or if you think I'm being too restrictive, keep it to yourself. If after seven days there is no answer at all, so be it. Thanks again, everyone.

    UPDATE 2: I really appreciate everyone coming forward with suggestions. There is no way for me to try all of these before the bounty expires. For now I'm going to let this bounty run out and whoever has the most votes will get the 100 rep points. Thanks again!

    Read the article

  • Apache config that uses two document roots based on whether the requested resource exists in the first

    - by mattalexx
    Background: I have a client site that consists of a CakePHP installation and a Magento installation:

        /web/example.com/
        /web/example.com/app/                  <== CakePHP
        /web/example.com/app/webroot/          <== DocumentRoot
        /web/example.com/app/webroot/store/    <== Magento
        /web/example.com/config/               <== Site-wide config
        /web/example.com/vendors/              <== Site-wide libraries

    The server runs Apache 2.2.3.

    The problem: The whole company has FTP access and got used to clogging up the /web/example.com/, /web/example.com/app/webroot/, and /web/example.com/app/webroot/store/ directories with their own files. Sometimes these files need HTTP access and sometimes they don't. In any case, this mess makes my job harder when it comes to maintaining the site. Code merges, tarring the live code, and so on are very complicated and usually require a bunch of filters.

    Abandoned solution: At first, I thought I would set up a new subdomain on the same server, move all of their files there, and change their FTP chroot. But that wouldn't work, for these reasons: Firstly, I have no idea (and neither do they remember) what marketing materials they've sent out that contain URLs to certain resources they've uploaded to the server, using the main domain, and also using arbitrary subdomains that resolve to the main virtual host because it has ServerAlias *.example.com. So suddenly having them only use static.example.com isn't feasible. Secondly, the PHP scripts in their projects are potentially very non-portable. I want their files to stay in as similar an environment to the one they were built in as I can. Also, I do not want to debug their code to make it portable.

    Half-baked solution: After some thought, I decided to find a way to section off the actual website files into another directory that they would not touch. The company's uploaded files would stay where they were. This would ensure that I didn't break any of their projects that needed HTTP access. It would look something like this:

        /web/example.com/                           <== A bunch of their files are in here
        /web/example.com/app/webroot/               <== 1st DocumentRoot; a bunch of their files are in here
        /web/example.com/app/webroot/store/         <== Some more are in here
        /web/example.com/site/                      <== New dir; contains only site files
        /web/example.com/site/app/                  <== CakePHP
        /web/example.com/site/app/webroot/          <== 2nd DocumentRoot
        /web/example.com/site/app/webroot/store/    <== Magento
        /web/example.com/site/config/               <== Site-wide config
        /web/example.com/site/vendors/              <== Site-wide libraries

    After I made this change, I would not need to pay attention to anything except the stuff within /web/example.com/site/, and my job would be a lot easier. I would be the only one changing stuff in there. So here's where the Apache magic would happen: I need an HTTP request to http://www.example.com/ to first use /web/example.com/app/webroot/ as the document root. If nothing is found (no miscellaneous uploaded company projects are found), try finding something within /web/example.com/site/app/webroot/. Another thing to keep in mind: the site might have some problems if the $_SERVER['DOCUMENT_ROOT'] variable reads /web/example.com/app/webroot/ but the actual files are within /web/example.com/site/app/webroot/. It would be better if the DOCUMENT_ROOT environment variable could be /web/example.com/site/app/webroot/ for anything within the /web/example.com/site/app/webroot/ directory.

    Conclusion: Is my half-baked solution possible with Apache 2.2.3? Is there a better way to solve this problem?
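
    One way to get this fallback behaviour on stock Apache 2.2 is mod_rewrite's file tests. This is only a sketch under the assumptions above (mod_rewrite enabled, paths exactly as listed), not a tested configuration:

        <VirtualHost *:80>
            ServerName www.example.com
            ServerAlias *.example.com

            # Default to the new, clean document root, so PHP sees
            # DOCUMENT_ROOT = /web/example.com/site/app/webroot
            DocumentRoot /web/example.com/site/app/webroot

            RewriteEngine On
            # If the request maps to a real file or directory among the
            # company's uploads in the old webroot, serve it from there.
            RewriteCond /web/example.com/app/webroot%{REQUEST_URI} -f [OR]
            RewriteCond /web/example.com/app/webroot%{REQUEST_URI} -d
            RewriteRule ^/(.*)$ /web/example.com/app/webroot/$1 [L]
        </VirtualHost>

    Note that requests answered out of the old tree would still report the new DOCUMENT_ROOT, so the company's scripts that rely on that variable would need checking.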

    Read the article

  • Bancassurers Seek IT Solutions to Support Distribution Model

    - by [email protected]
    Oracle Insurance's director of marketing for EMEA, John Sinclair, attended the third annual Bancassurance Forum in Vienna last month. He reports that the outlook for bancassurance in EMEA remains positive, despite changing market conditions that have led a number of bancassurers to re-examine their business models.

    Vienna is at the crossroads between mature Western European markets, where bancassurance is now an established best practice, and more recently tapped Eastern European markets that offer the greatest growth potential. Attendance at the Bancassurance Forum was good, with 87 bancassurance attendees, most in very senior positions in the industry. The conference provided the chance for a lively discussion among bancassurers looking to keep abreast of the latest trends in one of Europe's most successful distribution models for insurance.

    Even under normal business conditions, there is a great demand for best-practice sharing within the industry, as there is no standard formula for success. Each company has to chart its own course and choose the strategies for sales, product development and the structure of ownership that make sense for its business, and as soon as they get it right, bancassurers need to adapt the mix to keep up with ever-changing regulations, competition and economic conditions. To optimize the overall relationship between banking and insurance for mutual benefit, a balance needs to be struck between potentially conflicting interests. The banking side of the house is looking for greater wallet share from its customers and the ability to increase profitability by bundling insurance products with higher margins - especially in light of the recent economic crisis, where margins for traditional banking products are low and competition high. The insurance side of the house seeks access to new customers through a complementary distribution channel that is efficient and cost effective. To make the relationship work, it is important that both sides of the same house forge strategic and long-term relationships - irrespective of whether the underlying business model is supported by a distribution agreement, cross-ownership or other forms of capital structure.

    However, this third annual conference was not held under normal business conditions. The conference took place in challenging, yet interesting times. ING's forced spinoff of its insurance operations under pressure from the EU Commission and the troubling losses suffered by Allianz as a result of the Dresdner bank sale were fresh in everyone's mind. One year after markets crashed, there is now enough hindsight to better understand the implications for bancassurance and the best practices that are emerging to deal with them. The loan-driven business that has been crucial to bancassurance up till now evaporated during the crisis, leaving bancassurers grappling with how to change their overall strategy from a loan-driven to a more diversified model. Attendees came to the conference to learn what strategies were working - not only to cope with the market shift, but to take advantage of it as markets pick up. Over the course of 14 customer case studies and numerous analyst presentations, topical issues ranging from getting the business model right to the impact on capital structuring of Solvency II were debated openly.

    Many speakers alluded to the need to specifically design insurance products with the banking distribution channel in mind, which brings with it specific requirements such as a high degree of standardization to achieve efficiency and reduce training costs. Moreover, products must be engineered to suit end consumers who consider banks a one-stop shop.

    The importance of IT to the successful implementation of bancassurance strategies was a theme that surfaced regularly throughout the conference. The cross-selling opportunity - which will ultimately determine the success or failure of any bancassurance model - can only be fully realized through a flexible IT architecture that enables banking and insurance processes to be integrated and presented to front-line staff through a common interface. However, the reality is that most bancassurers have legacy IT systems, which constrain the business's ability to implement new strategies and maintain competitiveness in turbulent times.

    My colleague Glenn Lottering, who chaired the conference, believes that the primary opportunities for bancassurers to extract value from their IT infrastructure investments lie in distribution management, risk management with the advent of Solvency II, and achieving operational excellence. "Oracle is ideally suited to meet the needs of bancassurance," Glenn noted, "supplying market-leading software for both banking and insurance. Oracle provides adaptive systems that let customers easily integrate hybrid business processes from both worlds while leveraging existing IT infrastructure."

    Overall, the consensus at the conference was that the outlook for bancassurance in EMEA remains positive, despite changing market conditions that have led a number of bancassurers to re-examine their business models.

    John Sinclair is marketing director for Oracle Insurance in EMEA. He has more than 20 years of experience in insurance and financial services.

    Read the article

  • Building vs. Buying a Master Data Management Solution

    - by david.butler(at)oracle.com
    Many organizations prefer to build their own MDM solutions. The argument is that they know their data quality issues and their data better than anyone. Plus, a focused solution will cost less in the long run than a vendor-supplied general-purpose product. This is not unreasonable if you think of MDM as a point solution for a particular data quality problem. But this approach carries significant risk. We now know that organizations achieve significant competitive advantages when they deploy MDM as a strategic enterprise-wide solution, with the most common best practice being to deploy a tactical MDM solution and grow it into a full information architecture. A build-your-own approach most certainly will not scale to a larger architecture unless it is done correctly with the larger solution in mind.

    It is possible to build a home-grown point MDM solution in such a way that it will dovetail into broader MDM architectures. A very good place to start is to use the same basic technologies that Oracle uses to build its own MDM solutions. Start with the Oracle 11g database to create a flexible, extensible and open data model to hold the master data and all needed attributes. The Oracle database is the most flexible, highly available and scalable database system on the market. With its Real Application Clusters (RAC) it can even support the mixed OLTP and BI workloads that represent typical MDM data access profiles. Use Oracle Data Integrator (ODI) for batch data movement between applications, MDM data stores, and the BI layer. Use Oracle GoldenGate for more real-time data movement. Use Oracle's SOA Suite for application integration, with its BPEL Process Manager to orchestrate MDM connections to business processes, Identity Management for managing users, WS Manager for managing web services, Business Intelligence Enterprise Edition for analytics, and JDeveloper for creating or extending the MDM management application. Oracle utilizes these technologies to build its MDM Hubs. Customers who build their own MDM solution using these components will easily migrate to Oracle-provided MDM solutions when the home-grown solution runs out of gas. But, even with a full stack of open, flexible MDM technologies, creating a robust MDM application can be a daunting task.

    For example, a basic MDM solution will need:

    - a set of data access methods that support master data as a service, as well as direct real-time access, batch loads and extracts;
    - a data migration service for initial loads and periodic updates;
    - a metadata management capability for items such as business entity matrixed relationships and hierarchies;
    - a source system management capability to fully cross-reference business objects and to satisfy seemingly conflicting data ownership requirements;
    - a data quality function that can find and eliminate duplicate data while ensuring correct data attribute survivorship;
    - a set of data quality functions that can manage structured and unstructured data;
    - a data quality interface to assist with preventing new errors from entering the system even when data entry is outside the MDM application itself;
    - a continuing data cleansing function to keep the data up to date;
    - an internal triggering mechanism to create and deploy change information to all connected systems;
    - a comprehensive role-based data security system to control and monitor data access, update rights, and maintain change history;
    - a flexible business rules engine for managing master data processes such as privacy and data movement;
    - a user interface to support casual users and data stewards;
    - a business intelligence structure to support profiling, compliance, and business performance indicators; and
    - an analytical foundation for directly analyzing master data.

    Oracle's pre-built MDM Hub solutions are full-featured 3-tier Internet applications designed to participate in the full Oracle technology stack or to run independently in other open IT SOA environments. Building MDM solutions from scratch can take years. Oracle's pre-built MDM solutions can bring quality data to the enterprise in a matter of months. But if you must build, at least build with the world's best technology stack, in a way that simplifies the eventual upgrade to Oracle MDM and to the full enterprise-wide information architecture that it enables.

    Read the article

  • Running Multiple WebLogic and OSB Domains

    - by jeff.x.davies
    I have any number of OSB domains created on my machine at any point in time. For example, I have different domains for different versions of Oracle Service Bus and Oracle SOA Suite. I also have different domains for different purposes: I have a demo domain and another domain for the projects in my blog. Starting with OSB 11g and the Apache Derby server, there is a small "gotcha" if you want to create multiple domains on a development machine. When you create a new domain for OSB 11g, it will use the same database information for every domain, and this will cause an error when starting the admin server of the second domain (the first domain doesn't have to be running for this error to occur). Here is an example of the error message in the server console:

        ####<Mar 8, 2011 2:58:48 PM PST> <Critical> <JTA> <jeff-laptop> <AdminServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1299625128464> <BEA-110482> <A logging last resource failed during initialization. The server cannot boot unless all configured logging last resources (LLRs) initialize. Failing reason: weblogic.transaction.loggingresource.LoggingResourceException: java.sql.SQLException: JDBC LLR, table verify failed for table 'WL_LLR_ADMINSERVER', row 'JDBC LLR Domain//Server' record had unexpected value 'osb11gR1PS3//AdminServer' expected 'OSBCIM//AdminServer' *** ONLY the original domain and server that creates an LLR table may access it ***

    The solution is to create a database instance for each of your domains, and this is very simple to do. After you create a domain using the Configuration Wizard, locate the wlsbjmsrpDataSource-jdbc.xml file that is found under the DOMAIN_HOME/config/jdbc directory. Near the top of the file you will see the following entry:

        <url>jdbc:derby://localhost:1527/osbexamples;create=true;ServerName=localhost;databaseName=osbexamples</url>

    You need to modify this entry with a different and unique database name. The easiest way to do this is to substitute the name of your domain. For example:

        <url>jdbc:derby://localhost:1527/mydomain;create=true;ServerName=localhost;databaseName=mydomain</url>

    will create a database named mydomain. Now, when you restart the admin server for the domain, it will create the new database for you. Do this for each domain you create on your development machine and you'll have no troubles.

    The process is much simpler if you set the name while creating the domain in the Configuration Wizard: simply name the database when you get to the Configure JDBC Component Schema step, select the OSB JMS Reporting Provider, and set the name in the DBMS/Service field to whatever name you like, as shown in Figure 1 below.

    Figure 1 – Configuring the JDBC Component Schema

    That is all there is to it. Now you can create as many domains on your laptop or development machine as you like and not have to worry about them conflicting with each other.

    Read the article

  • ArchBeat Link-o-Rama Top 10 for December 9-15, 2012

    - by Bob Rhubart
    You click, we listen. The following list reflects the Top 10 most popular items posted on the OTN ArchBeat Facebook page for the week of December 9-15, 2012.

    DevOps Basics II: What is Listening on Open Ports and Files – WebLogic Essentials | Dr. Frank Munz
    "Can you easily find out which WebLogic servers are listening to which port numbers and addresses?" asks Dr. Frank Munz. The good doctor has an answer—and a tech tip.

    Using OBIEE against Transactional Schemas Part 4: Complex Dimensions | Stewart Bryson
    "Another important entity for reporting in the Customer Tracking application is the Contact entity," says Stewart Bryson. "At first glance, it might seem that we should simply build another dimension called Dim – Contact, and use analyses to combine our Customer and Contact dimensions along with our Activity fact table to analyze Customer and Contact behavior."

    SOA 11g Technology Adapters – ECID Propagation | Greg Mally
    "Many SOA Suite 11g deployments include the use of the technology adapters for various activities including integration with FTP, database, and files to name a few," says Oracle Fusion Middleware A-Team member Greg Mally. "Although the integrations with these adapters are easy and feature rich, there can be some challenges from the operations perspective." Greg's post focuses on technical tips for dealing with one of these challenges.

    Podcast: DevOps and Continuous Integration
    In Part 1 of a 3-part program, panelists Tim Hall (Senior Director of product management for Oracle Enterprise Repository and Oracle's Application Integration Architecture), Robert Wunderlich (Principal Product Manager for Oracle's Application Integration Architecture Foundation Pack) and Peter Belknap (Director of product management for Oracle SOA Integration) discuss why DevOps matters and how it changes development methodologies and organizational structure.

    Good To Know - Conflicting View Objects and Shared Entity | Andrejus Baranovskis
    Oracle ACE Director Andrejus Baranovskis shares his thoughts -- and a sample application -- dealing with an "interesting ADF behavior" encountered over the weekend.

    Cloud Deployment Models | B. R. Clouse
    Looking out for the cloud newbies... "As the cloud paradigm grows in depth and breadth, more readers are approaching the topic for the first time, or from a new perspective," says B. R. Clouse. "This blog is a basic review of cloud deployment models, to help orient newcomers and neophytes."

    Service governance morphs into cloud API management | David Linthicum
    "When building and using clouds, the ability to manage APIs or services is the single most important item you can provide to ensure the success of the project," says David Linthicum. "But most organizations driving a cloud project for the first time have no experience handling a service-based architecture and don't see the need for API management until after deployment. By then, it's too late."

    Oracle Fusion Middleware Security: Password Policy in OAM 11g R2 | Rob Otto
    Rob Otto continues the Oracle Fusion Middleware A-Team "Oracle Access Manager Academy" series with a detailed look at OAM's ability to support "a subset of password management processes without the need to use Oracle Identity Manager and LDAP Sync."

    Understanding the JSF Lifecycle and ADF Optimized Lifecycle | Steven Davelaar
    Could you call that a surprise ending? Oracle WebCenter & ADF Architecture Team (A-Team) member Steven Davelaar learned a lot more than he expected while creating a UKOUG presentation entitled "What you need to know about JSF to be successful with ADF."

    Expanding on requestaudit - Tracing who is doing what...and for how long | Kyle Hatlestad
    "One of the most helpful tracing sections in WebCenter Content (and one that is on by default) is the requestaudit tracing," says Oracle Fusion Middleware A-Team architect Kyle Hatlestad. Get up close and technical in his post.

    Thought for the Day
    "There is no code so big, twisted, or complex that maintenance can't make it worse." — Gerald Weinberg
    Source: SoftwareQuotes.com

    Read the article

  • As a person getting into mobile development, what's the best mobile platform in terms of profitability? [closed]

    - by Kyle Loman
    I realize this question can range very far so would love to hear any and all opinions on this. However, I'll be honest and say that I have been thinking of this in terms of most profitable. I know how this may sound either way but this is one of my main sticking points. I realize that I'm not guaranteed a single cent and success is never guaranteed but I'm going into this with the thought of making something out of it both financially and also for my own interest. I know that iOS gets a lot of attention on this front but Android commands a lot more market share. However, I know there are drawbacks to Android too, whether it's in the actual development process and programming (though I've heard conflicting reports on this, such as how easy/difficult it is for to address screen res in different devices) or the app ecosystem being flooded. But iOS's app ecosystem has been described as too saturated and harder to compete in for that reason. Since Windows Phone has fewer apps than both of those two, that might be the best place to start in order to be closer to the ground floor of the store and be noticed more? Less saturation = better chances of sales or differentiating? Something like the gold rush during the first years of the iOS App Store (not exactly but at least in concept)? Would it be that despite fewer users on the platform, there's more exposure due to less competition so that may translate to better success at sales? Plus, I know MS is in it for the long haul so I'm not too fearful of something like WebOS going away. Obviously RIM isn't very popular nowadays but I read a recent article that says Blackberry actually has the apps that make the most money, any thoughts on that: http://gigaom.com/mobile/which-mobile-oss-apps-make-most-money-surprise-its-blackberry/ Again, this is all I've heard or known about so if there's anything to add or correct here, please do. In addition, this has actually affected my next personal phone upgrade. I'm eligible for a carrier discount now and I've had my eye on the iPhone 5. However, the Lumia 920 is the one I'm holding out for and I'm open to trying an Android but I'm not sure I can wait that long for any new Nexus or even the Razr HD. Even the new Lumia in November is making me antsy, I'm so close to just getting an iPhone 5. But when I say this has affected my phone choice, I'd want to be able to carry the apps I write with me so that I'm able to pull my phone out to show people without having to carry around a second device to do so. So that's why I'd like to make my personal phone match the main platform I'm developing for. Of course, I will likely expand to other platforms if I gain any decent success but the one I target now would serve well as my personal phone I carry around so that I can use it as a marketing tool, in a sense, showing people my apps if the opportunity presents itself. So what's the best mobile platform to choose, and especially in regards to most lucrative? As said previously, this would influence my personal phone choice greatly. Thanks in advance and I hope this isn't taken the wrong way - I understand there are trade-offs and other factors that may balance this out but making some revenue is key among that. 
For some background, I have done software development and know programming language concepts so I'm not entirely new to it and I do get the notion of being familiar with these things so that I can translate this skill among a variety of languages but I'm currently just having difficulty choosing my first main mobile platform based on the criteria I've outlined above.

    Read the article

  • Oracle MAA Part 1: When One Size Does Not Fit All

    - by JoeMeeks
    The good news is that Oracle Maximum Availability Architecture (MAA) best practices combined with Oracle Database 12c (see video) introduce first-in-the-industry database capabilities that truly make unplanned outages and planned maintenance transparent to users. The trouble with such good news is that Oracle's enthusiasm in evangelizing its latest innovations may leave some to wonder if we've lost sight of the fact that not all database applications are created equal. After all, many databases don't have the business requirements for high availability and data protection that require all of Oracle's 'stuff'. For many real-world applications, a controlled amount of downtime and/or data loss is OK if it saves money and effort.

    Well, not to worry. Oracle knows that enterprises need solutions that address the full continuum of requirements for data protection and availability. Oracle MAA accomplishes this by defining four HA service level tiers: BRONZE, SILVER, GOLD and PLATINUM. The figure below shows the progression in service levels provided by each tier. Each tier uses a different MAA reference architecture to deploy the optimal set of Oracle HA capabilities that reliably achieve a given service level (SLA) at the lowest cost. Each tier includes all of the capabilities of the previous tier and builds upon the architecture to handle an expanded fault domain.

    Bronze is appropriate for databases where simple restart or restore from backup is 'HA enough'. Bronze is based upon a single-instance Oracle Database with MAA best practices that use the many capabilities for data protection and HA included with every Oracle Enterprise Edition license. Oracle-optimized backups using Oracle Recovery Manager (RMAN) provide data protection and are used to restore availability should an outage prevent the database from being able to restart.

    Silver provides an additional level of HA for databases that require minimal or zero downtime in the event of database instance or server failure, as well as many types of planned maintenance. Silver adds clustering technology - either Oracle RAC or RAC One Node. RMAN provides database-optimized backups to protect data and restore availability should an outage prevent the cluster from being able to restart.

    Gold raises the game substantially for business-critical applications that can't accept vulnerability to single points of failure. Gold adds database-aware replication technologies, Active Data Guard and Oracle GoldenGate, which synchronize one or more replicas of the production database to provide real-time data protection and availability. Database-aware replication greatly increases HA and data protection beyond what is possible with storage replication technologies. It also reduces cost while improving return on investment by actively utilizing all replicas at all times.

    Platinum introduces all of the sexy new Oracle Database 12c capabilities that Oracle staff will gush over with great enthusiasm. These capabilities include Application Continuity for reliable replay of in-flight transactions that masks outages from users; Active Data Guard Far Sync for zero data loss protection at any distance; new Oracle GoldenGate enhancements for zero-downtime upgrades and migrations; and Global Data Services for automated service management and workload balancing in replicated database environments. Each of these technologies requires additional effort to implement. But they deliver substantial value for your most critical applications where downtime and data loss are not an option.

    The MAA reference architectures are inherently designed to address conflicting realities. On one hand, not every application has the same objectives for availability and data protection – the "One Size Does Not Fit All" of this blog post's title. On the other hand, standard infrastructure is an operational requirement and a business necessity in order to reduce complexity and cost. MAA reference architectures address both realities by providing a standard infrastructure optimized for Oracle Database that enables you to dial in the level of HA appropriate for different service level requirements. This makes it simple to move a database from one HA tier to the next should business requirements change, or from one hardware platform to another – whether it's your favorite non-Oracle vendor or an Oracle Engineered System.

    Please stay tuned for additional blog posts in this series that dive into the details of each MAA reference architecture. Meanwhile, more information on Oracle HA solutions and the Maximum Availability Architecture can be found at: Oracle Maximum Availability Architecture - Webcast; Maximize Availability with Oracle Database 12c - Technical White Paper

    Read the article

  • Welcome To The Nashorn Blog

    - by jlaskey
    Welcome to all.  Time to break the ice and instantiate The Nashorn Blog.  I hope to contribute routinely, but we are very busy, at this point, preparing for the next development milestone and, of course, getting ready for open source. So, if there are long gaps between postings please forgive. We're just coming back from JavaOne and are stoked by the positive response to all the Nashorn sessions. It was great for the team to have the front and centre slide from Georges Saab early in the keynote. It seems we have support coming from all directions. Most of the session videos are posted. Check out the links. Nashorn: Optimizing JavaScript and Dynamic Language Execution on the JVM. Unfortunately, Marcus - the code generation juggernaut,  got saddled with the first session of the first day. Still, he had a decent turnout. The talk focused on issues relating to optimizations we did to get good performance from the JVM. Much yet to be done but looking good. Nashorn: JavaScript on the JVM. This was the main talk about Nashorn. I delivered the little bit of this and a little bit of that session with an overview, a follow up on the open source announcement, a run through a few of the Nashorn features and some demos. The room was SRO, about 250±. High points: Sam Pullara, from Twitter, came forward to describe how painless it was to get Mustache.js up and running (20x over Rhino), and,  John Ceccarelli, from NetBeans came forward to describe how Nashorn has become an integral part of Netbeans. A healthy Q & A at the end was very encouraging. Meet the Nashorn JavaScript Team. Michel, Attila, Marcus and myself hosted a Q & A. There was only a handful of people in the room (we assume it was because of a conflicting session ;-) .) Most of the questions centred around Node.jar, which leads me to believe, Nashorn + Node.jar is what has the most interest. Akhil, Mr. Node.jar, sitting in the audience, fielded the Node.jar questions. Nashorn, Node, and Java Persistence. Doug Clarke, Akhil and myself, discussed the title topics, followed by a lengthy Q & A (security had to hustle us out.) 80 or so in the room. Lots of questions about Node.jar. It was great to see Doug's use of Nashorn + JPA. Nashorn in action, with such elegance and grace. Putting the Metaobject Protocol to Work: Nashorn’s Java Bindings. Attila discussed how he applied Dynalink to Nashorn. Good turn out for this session as well. I have a feeling that once people discover and embrace this hidden gem, great things will happen for all languages running on the JVM. Finally, there were quite a few JavaOne sessions that focused on non-Java languages and their impact on the JVM. I've always believed that one's tool belt should carry a variety of programming languages, not just for domain/task applicability, but also to enhance your thinking and approaches to problem solving. For the most part, future blog entries will focus on 'how to' in Nashorn, but if you have any suggestions for topics you want discussed, please drop a line.  Cheers. 

    Read the article

  • Do jQuery and MooTools usually conflict if both are used on a webpage? [migrated]

    - by Charming Prince
    I have this website I am designing. I tried using MooTools 1.3.1 to animate some of the div boxes when clicked, or when the mouse hovers around them, to show the content. The thing is that it doesn't seem to work on the webpage, but if I try the same script on a blank webpage it works. I'm thinking it's probably because I have jQuery 1.5.2 on the same page and the two scripts are conflicting with each other, because if I remove the jQuery, the MooTools works. What should my option be? I need the jQuery to do some validations for me, so I can't remove it completely. Here is the code:

        <script>
        // -vertical
        var mySlide = new Fx.Slide('test');
        $('slidein').addEvent('click', function(e){
            e = new Event(e);
            mySlide.slideIn();
            e.stop();
        });
        $('slideout').addEvent('click', function(e){
            e = new Event(e);
            mySlide.slideOut();
            e.stop();
        });
        $('toggle').addEvent('click', function(e){
            e = new Event(e);
            mySlide.toggle();
            e.stop();
        });
        $('hide').addEvent('click', function(e){
            e = new Event(e);
            mySlide.hide();
            e.stop();
        });
        </script>

    Here's the HTML:

        <html>
        <h3 class="section">Fx.Slide Vertical</h3>
        <a id="slideout" href="#">slideout</a> |
        <a id="slidein" href="#">slidein</a> |
        <a id="toggle" href="#">toggle</a> |
        <a id="hide" href="#">hide</a>
        <div id="test">
            Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do
            eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim
            ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut
            aliquip ex ea commodo consequat. Duis aute irure dolor in
            reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla
            pariatur. Excepteur sint occaecat cupidatat non proident, sunt in
            culpa qui officia deserunt mollit anim id est laborum.
        </div>

    Here's the CSS:

        #test {
            background: #222;
            color: #fff;
            padding: 10px;
            margin: 20px;
            border: 10px solid pink;
        }
        #test2 {
            background: #222;
            color: #fff;
            padding: 10px;
            margin: 20px;
            border: 10px solid pink;
        }

    I'm using the exact same code supplied by MooTools in their own example. If I do this on a blank webpage it works, but incorporated into my own webpage it doesn't, and my own page just has the script tag for jQuery in the head section of the HTML.
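
    For what it's worth, jQuery and MooTools do collide by default, since both claim the global $. A minimal sketch of the usual fix, assuming jQuery is loaded first and run in noConflict mode (the form ID and the validateForm() helper below are hypothetical placeholders for whatever validation the page does):

        <script src="jquery-1.5.2.min.js"></script>
        <script>
            // Give the $ shortcut back, so MooTools can own it;
            // jQuery itself stays available under its full name.
            jQuery.noConflict();
        </script>
        <script src="mootools-1.3.1.js"></script>
        <script>
            // MooTools code keeps using $ exactly as before.
            var mySlide = new Fx.Slide('test');

            // jQuery code must now use the jQuery name instead of $.
            jQuery(document).ready(function () {
                jQuery('#myform').submit(function () {
                    return validateForm(); // hypothetical validation hook
                });
            });
        </script>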

    Read the article

  • Confused about modifying the sprint backlog during a sprint

    - by Maltiriel
    I've been reading a lot about scrum lately, and I've found what seems to me to be conflicting information about whether or not it's OK to change the sprint backlog during a sprint. The Wikipedia article on scrum says it's not OK, and various other articles say this as well. My software development professor also taught the same thing during an overview of scrum. However, I read Scrum and XP from the Trenches, and that describes a section for unplanned items on the taskboard. So then I looked up the Scrum Guide, and it says that during the sprint "No changes are made that would affect the Sprint Goal", and, in the discussion of the Sprint Goal, "If the work turns out to be different than the Development Team expected, then they collaborate with the Product Owner to negotiate the scope of Sprint Backlog within the Sprint." It goes on to say in the discussion of the Sprint Backlog:

        The Sprint Backlog is a plan with enough detail that changes in progress can be understood in the Daily Scrum. The Development Team modifies Sprint Backlog throughout the Sprint, and the Sprint Backlog emerges during the Sprint. This emergence occurs as the Development Team works through the plan and learns more about the work needed to achieve the Sprint Goal. As new work is required, the Development Team adds it to the Sprint Backlog. As work is performed or completed, the estimated remaining work is updated. When elements of the plan are deemed unnecessary, they are removed. Only the Development Team can change its Sprint Backlog during a Sprint. The Sprint Backlog is a highly visible, real-time picture of the work that the Development Team plans to accomplish during the Sprint, and it belongs solely to the Development Team.

    So at this point I'm altogether confused. Thinking about it, it makes more sense to me to take the second approach. The individual, specific items in the backlog don't seem to me to be the most important thing, but rather the sprint goal, so not changing the sprint goal but being able to change the backlog makes sense. For instance, if both the product owner and the team thought they were on the same page about a story, but as the sprint progressed they figured out there was a misunderstanding, it seems like it makes sense to change the tasks that make up that story accordingly. Or if there was some story or task that was forgotten about, but is required to reach the sprint goal, I would think it would be best to add the story or task to the backlog during the sprint. However, there are a lot of people who seem quite adamant that any change to the sprint backlog is not OK. Am I misunderstanding that position somehow? Are those folks defining the sprint backlog differently somehow? My understanding of the sprint backlog is that it consists of both the stories and the tasks they're broken down into. Anyway, I would really appreciate input on this issue. I'm trying to figure out both what the idealistic scrum approach is to changing the sprint backlog during a sprint, and whether people who use scrum successfully for development allow changing the sprint backlog during a sprint.

    Read the article

  • How to enable gzip HTTP compression on Windows Azure dynamic content

    - by Steven
    Hi all, I've been trying unsuccessfully to enable gzip HTTP compression on my Windows Azure hosted WCF RESTful service, which returns JSON only from GET and POST requests. I have tried so many things that I would have a hard time listing all of them, and I now realise I have been working with conflicting information (regarding old versions of Azure, etc.), so I think it best to start with a clean slate! I am working with Visual Studio 2008, using the February 2010 tools for Visual Studio. So, according to the following link, HTTP compression has now been enabled: http://msdn.microsoft.com/en-us/library/ff436045.aspx ... and I've used the advice at the following page (the URL compression advice only), but I get no compression: http://blog.smarx.com/posts/iis-compression-in-windows-azure

        <urlCompression doStaticCompression="true"
                        doDynamicCompression="false"
                        dynamicCompressionBeforeCache="true" />

    It doesn't help that I don't know what the difference is between urlCompression and httpCompression. I've tried to find out, but to no avail! Could the fact that the tools for Visual Studio were released before the version of Azure which supports compression be a problem? I read somewhere that with the latest tools, you can choose which version of the Azure OS you want to use when you publish ... but I don't know if that's true, and if it is, I can't find where to choose. Could I be using a version from before HTTP compression was enabled? I've also tried the Blowery HTTP compression module, but no results. Does anyone have any up-to-date advice on how to achieve this? i.e. advice that relates to the current version of the Azure OS. Cheers! Steven
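
    One detail worth flagging: the urlCompression snippet quoted above sets doDynamicCompression="false", and a WCF service returning JSON produces dynamic responses, so nothing would be compressed even if everything else were right. As for the two sections: urlCompression is the per-site on/off switch, while httpCompression defines the compression schemes and which MIME types they apply to. Below is a hedged sketch of the relevant IIS 7 configuration, assuming the httpCompression section can be written at this level (on Azure at the time it was typically locked in applicationHost.config and had to be changed via appcmd in a startup task):

        <system.webServer>
            <!-- the on/off switch: compress dynamic responses too -->
            <urlCompression doStaticCompression="true"
                            doDynamicCompression="true" />
            <!-- the how: register JSON as a compressible dynamic type -->
            <httpCompression>
                <dynamicTypes>
                    <add mimeType="application/json" enabled="true" />
                </dynamicTypes>
            </httpCompression>
        </system.webServer>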

    Read the article

  • Synchronizing Asynchronous request handlers in Silverlight environment

    - by Eric Lifka
    For our senior design project, my group is making a Silverlight application that utilizes graph theory concepts and stores the data in a database on the back end. We have a situation where we add a link between two nodes in the graph, and upon doing so we run analysis to re-categorize our clusters of nodes. The problem is that this re-categorization is quite complex and involves multiple queries and updates to the database, so if multiple instances of it run at once it quickly garbles data and breaks (by trying to re-insert already used primary keys). Essentially it's not thread safe, and we're trying to make it safe, and that's where we're failing and need help :). The create-link function looks like this:

        private Semaphore dblock = new Semaphore(1, 1);

        // This function is on our service reference and gets called
        // by the client code.
        public int addNeed(int nodeOne, int nodeTwo)
        {
            dblock.WaitOne();
            submitNewNeed(createNewNeed(nodeOne, nodeTwo));
            verifyClusters(nodeOne, nodeTwo);
            dblock.Release();
            return 0;
        }

        private void verifyClusters(int nodeOne, int nodeTwo)
        {
            // Run analysis of nodeOne and nodeTwo in graph
        }

    All copies of addNeed should wait for the first one that comes in to finish before another can execute. But instead they all seem to be running and conflicting with each other in the verifyClusters method. One solution would be to force our front-end calls to be made synchronously, and in fact, when we do that everything works fine, so the code logic isn't broken. But when it's launched, our application will be deployed within a business setting and used by internal IT staff (or at least that's the plan), so we'll have the same problem. We can't force all clients to submit data at different times, so we really need to get it synchronized on the back end. Thanks for any help you can give; I'd be glad to supply any additional information that you could need!
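
    A hedged guess at the cause: unless the service is configured as a singleton, WCF can create a fresh service object per call (and basicHttpBinding, which Silverlight uses, behaves per-call), so every request gets its own brand-new Semaphore and nobody ever waits. A sketch of a process-wide gate instead; note it only serializes calls within a single server process:

        // static: shared by every service instance in this host process,
        // unlike the per-instance Semaphore above.
        private static readonly object DbGate = new object();

        public int addNeed(int nodeOne, int nodeTwo)
        {
            lock (DbGate)
            {
                submitNewNeed(createNewNeed(nodeOne, nodeTwo));
                verifyClusters(nodeOne, nodeTwo);
            }
            return 0;
        }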

    Read the article

  • RoleManager not working when ASPDotNetStorefront is served in a virtual folder with FormsAuthentication

    - by digiguru
    Does anyone know if there is a valid roleManager configuration I can apply to ASPDotNetStorefront to get the site working? We have a website with an ASP.NET storefront served in a virtual folder off the root:

        ourwebsite.com
        ourwebsite.com/shop

    Everything had been working fine until we recently put the groundwork in place for forms authentication on the website. This caused an error on the production server when you tried to get into the shop:

        Unable to cast object of type 'System.Web.Security.RolePrincipal' to type 'AspDotNetStorefrontCore.AspDotNetStorefrontPrincipal'.

    Looking at the shop's web.config, I noticed there was no roleManager node, so I tried to fix the problem by disabling it in the shop's web.config:

        <roleManager enabled="false">
        </roleManager>

    This prevented the error occurring, but also prevented the shopping basket from working. Instead I removed the roleManager tag from the root website:

        <!-- Conflicting with AspDotNetStorefront
        <roleManager enabled="true" defaultProvider="CustomizedRoleProvider">
            <providers>
                <add name="CustomizedRoleProvider"
                     type="System.Web.Security.SqlRoleProvider"
                     connectionStringName="constr" />
            </providers>
        </roleManager>
        -->

    This worked, but obviously prevents role authentication working on the root website, which is okay for the next 2 or 3 releases. Does anyone know the correct code for the roleManager in ASPDotNetStorefront?
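
    One hedged alternative to commenting the root provider out entirely: ASP.NET can keep a parent site's settings from flowing down into a child application. A sketch for the root web.config, assuming /shop is set up as its own application in IIS, so the root keeps its roles while the storefront keeps its own principal:

        <!-- inheritInChildApplications="false" stops this section
             from being inherited by the /shop application -->
        <location path="." inheritInChildApplications="false">
            <system.web>
                <roleManager enabled="true" defaultProvider="CustomizedRoleProvider">
                    <providers>
                        <add name="CustomizedRoleProvider"
                             type="System.Web.Security.SqlRoleProvider"
                             connectionStringName="constr" />
                    </providers>
                </roleManager>
            </system.web>
        </location>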

    Read the article

  • How can an iPhone access another non-iPhone device over wireless or bluetooth?

    - by Tai Squared
    I'm trying to figure out whether an iPhone can connect to another non-iPhone device over wireless or Bluetooth, and I have seen conflicting information. Much of what I've found was from before version 3.0 of the SDK came out, when it certainly wasn't possible. Questions like this mention you can't connect to an arbitrary device unless it's part of the "Works with iPhone" accessory program. Do I need hardware that is part of this program? Looking through the Apple documentation, it mentions connecting two iPhones, not an iPhone to another Bluetooth device. Then there are articles like this that include this quote:

        ...and with the newly-announced "standard support" should allow file transfer between the iPhone and a computer, as well as between nearby iPhones

    Other questions mention Bonjour, and the Apple documentation talks about connecting to Bonjour devices, but can an iPhone connect to any Bonjour device? Does it have to have a Wi-Fi connection, or can it use Bluetooth? Even if I could use Bluetooth to connect to another device, it won't be available on first-generation iPhones and iPod touches, I believe. Is that correct? I'm thinking of an iPhone application that would need to communicate with other non-iPhone devices in the area, probably using Bluetooth, but possibly a direct wireless connection. What are the possibilities and limitations of this approach? Is it not possible to have an iPhone connect to an arbitrary Bluetooth device? Does the other device have to be on a wireless network that supports Bonjour? I'm trying to figure out if it's even possible for this to work, or if it's not worth the effort.

    Read the article

  • Open source Java CMS for Google App Engine?

    - by markvgti
    I am looking for an open source Java CMS (Web CMS, actually) to run on Google App Engine. I have looked at related older questions on this topic (What CMS runs on Google AppEngine?, CMS over Google App Engine, with SEO etc.) but the problem is that they all largely list Python-based CMSes. Plus these questions are pretty old, and since GAE is a fast-moving target, I thought it might be worthwhile to ask again. I want a CMS for creating some websites (for myself and for others), but would rather not start writing one from scratch. A "good" (very subjective, I know) open source WCMS allows me to start using a product, while still being able to add to/extend the product/project. On the one hand I am looking for a somewhat mature product/project, on the other hand it's easier to start contributing to the development cycle of a young product/project (conflicting, I know :-). Here are some features that would be preferable: [X]HTML/XML/CSS based templating Ability to create multiple blogs Galleries Ability to create a "Downloads" section (is this pretty much standard?) Separate management for digital assets (images, PDFs, binary files etc.) Roles like "Administrator", "Editor", "Contributor" etc. (or their equivalents) Ability to move/reorganize pages Export to PDF Reformat content for printing Is the CMS you are about to suggest especially well-suited to publishing an online book? My idea is that while the book may be offered as a downloadable eBook, the latest, most current version will be the one available on the website.

    Read the article

  • PHP Dev Tools (Eclipse Plugin) -- Installation Error... help?

    - by Sean Ochoa
    I have already installed Web Tools, PyDev, and the default Eclipse installation for Ubuntu 10.04 (using "sudo apt-get install eclipse"). I'm now trying to install the PHP Development Tools plug-in for Eclipse, and I'm getting this error message:

        Cannot complete the install because of a conflicting dependency.
        Software being installed: PDT SDK Feature 1.0.5.v20081126-1856 (org.eclipse.php.sdk_feature.feature.group 1.0.5.v20081126-1856)
        Software currently installed: Eclipse XML Editors and Tools SDK 3.1.1.v200907161031-7A228DXETAqLQFBNMuHkC8-_dRPY (org.eclipse.wst.xml_sdk.feature.feature.group 3.1.1.v200907161031-7A228DXETAqLQFBNMuHkC8-_dRPY)
        Only one of the following can be installed at once:
            Eclipse XML Editors and Tools 3.1.1.v200907161031-7H6FMbDxtkMs9OeLGF98LRhdPKeo (org.eclipse.wst.xml_ui.feature.feature.jar 3.1.1.v200907161031-7H6FMbDxtkMs9OeLGF98LRhdPKeo)
            Eclipse XML Editors and Tools 3.0.4.v200811211541-7F2ENnCwum8W79A1UYNgSjOcFVJg (org.eclipse.wst.xml_ui.feature.feature.jar 3.0.4.v200811211541-7F2ENnCwum8W79A1UYNgSjOcFVJg)
        Cannot satisfy dependency:
            From: PDT SDK Feature 1.0.5.v20081126-1856 (org.eclipse.php.sdk_feature.feature.group 1.0.5.v20081126-1856)
            To: org.eclipse.php_feature.feature.group [1.0.5.v20081126-1856]
        Cannot satisfy dependency:
            From: PDT Feature 1.0.5.v20081126-1856 (org.eclipse.php_feature.feature.group 1.0.5.v20081126-1856)
            To: org.eclipse.wst.feature.group [3.0.0,4.0.0)
        Cannot satisfy dependency:
            From: Web Developer Tools 3.0.4.v200811190840-7A-8l8Qqcz0HyVgjXUE-iuOYZ9ai (org.eclipse.wst.feature.group 3.0.4.v200811190840-7A-8l8Qqcz0HyVgjXUE-iuOYZ9ai)
            To: org.eclipse.wst.xml_ui.feature.feature.group [3.0.4.v200811211541-7F2ENnCwum8W79A1UYNgSjOcFVJg]
        Cannot satisfy dependency:
            From: Eclipse XML Editors and Tools SDK 3.1.1.v200907161031-7A228DXETAqLQFBNMuHkC8-_dRPY (org.eclipse.wst.xml_sdk.feature.feature.group 3.1.1.v200907161031-7A228DXETAqLQFBNMuHkC8-_dRPY)
            To: org.eclipse.wst.xml_ui.feature.feature.group [3.1.1.v200907161031-7H6FMbDxtkMs9OeLGF98LRhdPKeo]
        Cannot satisfy dependency:
            From: Eclipse XML Editors and Tools 3.0.4.v200811211541-7F2ENnCwum8W79A1UYNgSjOcFVJg (org.eclipse.wst.xml_ui.feature.feature.group 3.0.4.v200811211541-7F2ENnCwum8W79A1UYNgSjOcFVJg)
            To: org.eclipse.wst.xml_ui.feature.feature.jar [3.0.4.v200811211541-7F2ENnCwum8W79A1UYNgSjOcFVJg]
        Cannot satisfy dependency:
            From: Eclipse XML Editors and Tools 3.1.1.v200907161031-7H6FMbDxtkMs9OeLGF98LRhdPKeo (org.eclipse.wst.xml_ui.feature.feature.group 3.1.1.v200907161031-7H6FMbDxtkMs9OeLGF98LRhdPKeo)
            To: org.eclipse.wst.xml_ui.feature.feature.jar [3.1.1.v200907161031-7H6FMbDxtkMs9OeLGF98LRhdPKeo]

    Any ideas?

    Read the article

  • NoSuchProviderException: smtp with log4j SMTP appender

    - by user1016403
    I am using log4j to send an email when there is an exception. Below is my log4j properties file configuration:

        log4j.rootLogger=WARN, R, email
        log4j.appender.R=org.apache.log4j.ConsoleAppender
        log4j.appender.R.layout=org.apache.log4j.PatternLayout
        log4j.appender.R.layout.ConversionPattern=%d{HH:mm:ss} %-5p [%c{1}]: %m%n
        log4j.appender.email=org.apache.log4j.net.SMTPAppender
        log4j.appender.email.BufferSize=10
        log4j.appender.email.SMTPHost=myhost.com
        [email protected]
        [email protected]
        log4j.appender.email.Subject=Error
        log4j.appender.email.layout=org.apache.log4j.PatternLayout

    Mine is a Maven project, and I have added dependencies for mail.jar, activation.jar and smtp.jar. But on application server startup itself I get the error below:

        [ERROR] log4j:ERROR Error occured while sending e-mail notification.
        [ERROR] javax.mail.NoSuchProviderException: smtp
        [ERROR] at javax.mail.Session.getService(Session.java:782)
        [ERROR] at javax.mail.Session.getTransport(Session.java:708)
        [ERROR] at javax.mail.Session.getTransport(Session.java:651)
        [ERROR] at javax.mail.Session.getTransport(Session.java:631)
        [ERROR] at javax.mail.Session.getTransport(Session.java:686)
        [ERROR] at javax.mail.Transport.send0(Transport.java:166)

    Am I missing anything here? What is the root cause of the error? Is it because of an incorrect SMTP host name, or because of missing/conflicting dependencies?
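
    A hedged guess, since the failure happens while looking up the transport rather than while talking to the server: NoSuchProviderException: smtp usually means JavaMail cannot resolve an SMTP provider from its registry, which points at overlapping mail jars on the classpath (for example mail.jar plus a separate smtp.jar, or a stray mailapi jar) rather than at a wrong host name. The full javax.mail:mail artifact already bundles the SMTP, IMAP and POP3 providers, so one cleanup sketch is to depend on it alone:

        <!-- the complete JavaMail artifact; with this in place the
             separate smtp.jar (and any mailapi jar) can be removed -->
        <dependency>
            <groupId>javax.mail</groupId>
            <artifactId>mail</artifactId>
            <version>1.4.4</version>
        </dependency>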

    Read the article

  • error with "pmem.c" compiling linux source code for android

    - by Preetam
    I am compiling Linux kernel source for the Android emulator. When I execute the make command (to build and cross-compile the kernel source), I get the following errors from the "pmem.c" file:

        root@ubuntu:~/common# make
        CHK     include/linux/version.h
        CHK     include/linux/utsrelease.h
        SYMLINK include/asm -> include/asm-x86
        CALL    scripts/checksyscalls.sh
        CHK     include/linux/compile.h
        CC      drivers/misc/pmem.o
        drivers/misc/pmem.c:441: error: conflicting types for 'phys_mem_access_prot'
        /home/preetam/common/arch/x86/include/asm/pgtable.h:383: note: previous declaration of 'phys_mem_access_prot' was here
        drivers/misc/pmem.c: In function 'flush_pmem_file':
        drivers/misc/pmem.c:805: error: implicit declaration of function 'dmac_flush_range'
        drivers/misc/pmem.c: In function 'pmem_setup':
        drivers/misc/pmem.c:1265: error: implicit declaration of function 'ioremap_cached'
        drivers/misc/pmem.c:1266: warning: assignment makes pointer from integer without a cast
        make[2]: *** [drivers/misc/pmem.o] Error 1
        make[1]: *** [drivers/misc] Error 2
        make: *** [drivers] Error 2
        root@ubuntu:~/common#

    How do I resolve these errors? It seems there may be some problems in the "pmem.c" file and I'll have to choose a different git repository, but that would be very complex, as I have already done most of the work up to here. I might have to find the correct version of this file. Please, someone tell me what I should do and how to solve these errors. Please help... thank you!
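
    A hedged reading of the log: the SYMLINK include/asm -> include/asm-x86 line shows the kernel is being configured for the host's x86 architecture, while pmem.c calls ARM-only functions such as dmac_flush_range and ioremap_cached — hence the conflicting-types and implicit-declaration errors. If the target is the ARM emulator (goldfish) kernel, the usual cross-build looks like the sketch below, assuming an arm-eabi- toolchain from the Android prebuilts is on the PATH and the tree ships a goldfish defconfig:

        # configure for the ARM emulator board, then cross-compile
        make ARCH=arm CROSS_COMPILE=arm-eabi- goldfish_defconfig
        make ARCH=arm CROSS_COMPILE=arm-eabi-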

    Read the article

  • realmethods plugin for eclipse

    - by user309944
    Hi friends, I am trying to install the realMethods plugin in Eclipse. When I install realMethods, it shows the following error:

        Cannot complete the install because of a conflicting dependency.
        Software being installed: realMethods GAE Generator 1.0.0.201004110340 (aib_eclipse_feature.feature.group 1.0.0.201004110340)
        Software currently installed: Eclipse IDE for Java EE Developers 1.2.1.20090918-0703 (epp.package.jee 1.2.1.20090918-0703)
        Only one of the following can be installed at once:
            Common Navigator View 3.4.0.I20090525-2000 (org.eclipse.ui.navigator 3.4.0.I20090525-2000)
            Common Navigator View 3.4.2.M20100120-0800 (org.eclipse.ui.navigator 3.4.2.M20100120-0800)
            Common Navigator View 3.4.1.M20090911-1550 (org.eclipse.ui.navigator 3.4.1.M20090911-1550)
        Cannot satisfy dependency:
            From: realMethods GAE Generator 1.0.0.201004110340 (aib_eclipse_feature.feature.group 1.0.0.201004110340)
            To: org.eclipse.ui.navigator 3.4.2
        Cannot satisfy dependency:
            From: Eclipse IDE for Java EE Developers 1.2.1.20090918-0703 (epp.package.jee 1.2.1.20090918-0703)
            To: org.eclipse.epp.package.jee.feature.feature.group [1.2.1.20090918-0703]
        Cannot satisfy dependency:
            From: Java EE IDE Feature 1.2.1.20090918-0703 (org.eclipse.epp.package.jee.feature.feature.group 1.2.1.20090918-0703)
            To: org.eclipse.platform.feature.group [3.5.1.R35x_v20090910-9gEeG1_FthkNDSP2odXdThaOu9GFDPn83DGB7]
        Cannot satisfy dependency:
            From: Eclipse Platform 3.5.1.R35x_v20090910-9gEeG1_FthkNDSP2odXdThaOu9GFDPn83DGB7 (org.eclipse.platform.feature.group 3.5.1.R35x_v20090910-9gEeG1_FthkNDSP2odXdThaOu9GFDPn83DGB7)
            To: org.eclipse.ui.navigator [3.4.1.M20090911-1550]

    What should I do? Can anyone help me? Thanks in advance.

    Read the article

  • Error displaying a WinForm in Design mode with a custom control on it.

    - by George
    I have a UserControl that is part of a class library. I reference this project from my solution, which adds a control from the referenced project to my toolbox. I add the control to a form. Everything looks good; I compile everything and run. Perfect... But when I close the form with the control on it and re-open it, I get the error below. The code continues to run. It may have something to do with namespaces: the original namespace was simply "Design", which was ambiguous and conflicting, so I decided to rename it. I think that's when my problems began.

        To prevent possible data loss before loading the designer, the following errors must be resolved: 2 Errors

        Could not find type 'Besi.Winforms.HtmlEditor.Editor'. Please make sure that the assembly that contains this type is referenced. If this type is a part of your development project, make sure that the project has been successfully built using settings for your current platform or Any CPU.

        The variable 'Editor1' is either undeclared or was never assigned.
        BesiAdmin frmOrder.Designer.vb Line:775 Column:1
          at System.ComponentModel.Design.Serialization.CodeDomSerializerBase.Error(IDesignerSerializationManager manager, String exceptionText, String helpLink)
          at System.ComponentModel.Design.Serialization.CodeDomSerializerBase.DeserializeExpression(IDesignerSerializationManager manager, String name, CodeExpression expression)
          at System.ComponentModel.Design.Serialization.CodeDomSerializerBase.DeserializeExpression(IDesignerSerializationManager manager, String name, CodeExpression expression)
          at System.ComponentModel.Design.Serialization.CodeDomSerializerBase.DeserializeStatement(IDesignerSerializationManager manager, CodeStatement statement)

    Read the article

  • Why does my J2ME DateField not display the correct date?

    - by olly
    I am storing values and date values in a record store. I have my DateField set up like this:

        StartDate = new DateField("Start Date ", DateField.DATE);
        cal1 = Calendar.getInstance();
        cal1.set(Calendar.YEAR, 2009);
        cal1.set(Calendar.MONTH, 0);
        cal1.set(Calendar.DAY_OF_MONTH, 1);
        StartDate.setDate(cal1.getTime());

    and I save the date as a string as follows:

        strStartDate = cal1.get(Calendar.DAY_OF_MONTH) + "/" + (cal1.get(Calendar.MONTH) + 1) + "/" + cal1.get(Calendar.YEAR);
        String detailsToAdd = strStartDate;

    (I have shortened the code.) Now I want to be able to edit the date at a later stage, and I need the code to do this. So far I have:

        EStartDate = new DateField("Start Date ", DateField.DATE);

    I had to rename the DateField as the old name was conflicting with other things. I basically need to show the selected record's date attribute. I currently have the other information displayed; I just need to show the correct date. When I run the program, the date field says <date>. Any help would be appreciated.
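    For illustration, here is one way to put the stored value back into the field — a minimal sketch assuming the saved string really is in "day/month/year" form; the class and method names are hypothetical, and the parsing is manual because CLDC/MIDP has no SimpleDateFormat:

        import java.util.Calendar;
        import javax.microedition.lcdui.DateField;

        final class DateFieldHelper {
            // Parses a "d/M/yyyy" string (as saved above) and pushes it into
            // an existing DateField so it stops rendering as <date>.
            static void showStoredDate(String stored, DateField field) {
                int first = stored.indexOf('/');
                int second = stored.indexOf('/', first + 1);
                int day = Integer.parseInt(stored.substring(0, first));
                int month = Integer.parseInt(stored.substring(first + 1, second));
                int year = Integer.parseInt(stored.substring(second + 1));
                Calendar cal = Calendar.getInstance();
                cal.set(Calendar.YEAR, year);
                cal.set(Calendar.MONTH, month - 1); // stored month is 1-based, Calendar is 0-based
                cal.set(Calendar.DAY_OF_MONTH, day);
                field.setDate(cal.getTime());
            }
        }

    A DateField whose date was never set is exactly what lcdui renders as <date>, so calling something like DateFieldHelper.showStoredDate(strStartDate, EStartDate) before the form is shown should make the stored date appear.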

    Read the article

  • Learning about the low level

    - by Anoners
    I'm interested in learning more about the PC at a lower (machine) level. I graduated from a school which taught concepts using the Java language, which abstracts that level away almost completely. As a result I only learned a bit from the one required assembly language course, and with ASM and quite a few architecture details crammed into one course, it was hard to get a very deep picture of what goes on down there. At work I focus on Unix socket programming in C, so I'm much closer to the hardware now, but I feel I should learn a bit more about what streams really are, how memory management and paging work, what goes on when you call "paint()" on a graphics buffer, and so on. I missed out on a lot of this, and I'm looking for a good resource to get me started. I've heard a lot about the "Pink Book" by Peter Norton (Programmer's Guide to the IBM PC, Programmer's Guide to Inside the PC, etc.). It seems like this is on the right track; however, the original is quite outdated, and the newer editions have had conflicting reviews, with many people saying to stay away from them. I'm not sure what the SO crowd thinks about this book, or whether they have suggestions for similar books, online resources, etc. that would be good primers for this sort of thing. Any suggestions would be appreciated.

    Read the article

  • Eclipse plugin installation/update issues

    - by The Elite Gentleman
    I've installed the following team repository plugins (along with their dependencies) for Eclipse Helios (using the Eclipse updater):

        MercurialEclipse 1.7.1
        Subclipse 1.6.17
        Subversive SVN

    All of these are the latest versions in the Eclipse Marketplace. My problem is that when I go to Eclipse "Preferences", under "Team" I only see CVS, yet in the Eclipse Marketplace I can see that these plugins are installed (it gives me the option to uninstall them). How do I get my team repositories to show up under "Team" in Preferences? Also, there is an update for "Eclipse IDE for Java EE Developers", but when I try to update it, the following error occurs:

        Cannot complete the install because of a conflicting dependency.
        Software being installed: Eclipse IDE for Java EE Developers 1.3.2.20110301-1807 (epp.package.jee 1.3.2.20110301-1807)
        Software currently installed: Shared profile 1.0.0.1276787175574 (SharedProfile_epp.package.jee 1.0.0.1276787175574)
        Only one of the following can be installed at once:
          toolingepp.package.jee.configuration 1.3.2.20110301-1807
          toolingepp.package.jee.configuration 1.3.0.20100617-0521
        Cannot satisfy dependency:
          From: Shared profile 1.0.0.1276787175574 (SharedProfile_epp.package.jee 1.0.0.1276787175574)
          To: toolingepp.package.jee.configuration [1.3.0.20100617-0521]
        Cannot satisfy dependency:
          From: Eclipse IDE for Java EE Developers 1.3.2.20110301-1807 (epp.package.jee 1.3.2.20110301-1807)
          To: toolingepp.package.jee.configuration [1.3.2.20110301-1807]

    How do I solve it? Yes, I've spent days Googling this issue, but nothing has solved my problem. Thanks in advance.

    Read the article

  • Strange MSI error when setup.exe is run

    - by Martin Jackson
    We're using Visual Studio 2008's Setup Project to create an installer for our .NET 3.5 app. We host the .exe and .msi files on a website for our client to access, and produce new ones regularly to provide updates. This has all been fine, but recently we've noticed some cases where installing via the .exe fails. The symptoms are:

    - The .exe downloads fine and runs fine.
    - It appears to download the .msi successfully (the "downloading application files" step plods through happily), but when it gets to the end of the "preparing to install" step, instead of launching the installer UI it pops up a message saying "This installation package could not be opened. Verify that the package exists and that you can access it, or contact the application vendor to verify that this is a valid Windows Installer package".

    You'd think the .msi is just corrupt or something, but running it explicitly (even downloading it from the same location as the .exe to do so) works just fine. This problem occurs on only some of our machines, which run a mixture of XP and Windows 7. The only pattern I can see in the machines that experience the problem is that they tend to have had the application installed on them longer (i.e. updating the app rather than installing it for the first time). It seems to me that it might be something to do with how/where the .exe downloads the .msi, and perhaps different versions are conflicting there? Has anyone experienced this before? Does anyone know where the installer .exe puts the .msi that it downloads?

    Read the article

< Previous Page | 116 117 118 119 120 121 122 123 124 125 126 127  | Next Page >