Search Results

Search found 31945 results on 1278 pages for 'database comparison'.

Page 52/1278 | < Previous Page | 48 49 50 51 52 53 54 55 56 57 58 59  | Next Page >

  • Technical workshop with the gurus: Architecting Oracle Database-As-A-Service (DBaaS)

    - by Javier Puerta
    OCTOBER 2013 Invitation: Architecting Oracle Database-As-A-Service (DBaaS). Dear partner, we are pleased to invite you to a 2-day workshop dedicated to EMEA partners on "Architecting Oracle Private Database Cloud & Delivering Database-As-A-Service (DBaaS)". This exclusive workshop will be delivered by Product Management and Product Development from Oracle HQ and focuses on the main theme CIOs have been tackling in the last decade: consolidation to private cloud. For many customers the journey to consolidation has led to DBaaS cloud deployments that significantly reduce costs and offer agile IT services. With the recent launch of Oracle Database 12c, the game really has changed in terms of what Oracle offers and how database clouds can be deployed. Who should attend: enterprise architects, infrastructure architects, and DB architects from system integrators and large independent software vendors. Take this opportunity to learn from the gurus how you can help your customers get the most from their cloud consolidation strategies. The workshop's main focus is service delivery, including standardization and consolidation, and how you would help your customers transform their current IT infrastructure to a service delivery model. It discusses best practices and reviews examples of customers that have successfully implemented a database cloud. The agenda is split across two days: Day 1 covers Overview & Planning, Database Cloud Demos, Customer Case Studies, and Database 12c; Day 2 covers Database Cloud Design, Database Cloud Implementation, EM Cloud Control, DBaaS on Engineered Systems, and Questions and Answers. Attendance is free of charge for qualified Oracle partners. Register now for one of the sessions: 5 & 6 November 2013, Manchester, United Kingdom; 7 & 8 November 2013, Munich, Germany; 11 & 12 November 2013, Amsterdam, Netherlands; 14 & 15 November 2013, Istanbul, Turkey; 18 & 19 November 2013, Vienna, Austria. Looking forward to seeing you! Javier Puerta, Director, Core Technology Partner Programs EMEA; Prashant Barot, Director, Core Technology.

    Read the article

  • Ask the Readers: Favorite Deal App?

    - by Jason Fitzpatrick
    Black Friday is mere days away and the holiday shopping season is upon us. This week we want to hear about your favorite deals apps—how do you find the best deals on the go? The parameters for this week’s Ask the Readers question are pretty straightforward: we want to hear about your favorite deal-finding/deal-comparison smartphone app. Whether it’s for Android, iOS, or a web site with a very cleanly formatted mobile interface, we want to hear about how you score serious deals out in the field. How do you know if the deal you see in the store is the best deal? Sound off in the comments with your favorite deal app and then check in on Friday for the What You Said roundup.

    Read the article

  • Are Language Comparisons Meaningful?

    - by Prasoon Saurav
    Dr. Bjarne Stroustrup, in his book D&E, says: "Several reviewers asked me to compare C++ to other languages. This I have decided against doing. Thereby, I have reaffirmed a long-standing and strongly held view: Language comparisons are rarely meaningful and even less often fair. A good comparison of major programming languages requires more effort than most people are willing to spend, experience in a wide range of application areas, a rigid maintenance of a detached and impartial point of view, and a sense of fairness. I do not have the time, and as the designer of C++, my impartiality would never be fully credible." -- The Design and Evolution of C++ (Bjarne Stroustrup). Do you agree with his statement that "language comparisons are rarely meaningful and even less often fair"? Personally I think that comparing a language X with Y makes sense because it gives many more reasons to love/despise X/Y :-P What do you people think?

    Read the article

  • Deleting huge chunks of data from mysql innodb

    - by mingyeow
    I need to delete a huge chunk of data from my production database, which runs about 100GB in size. If possible, I would like to minimize my downtime. My selection criteria for deleting is likely to be something like DELETE FROM POSTING WHERE USER.ID=5 AND UPDATED_AT<100. What is the best way to delete it? Build an index? Write a sequential script that deletes by paginating through the rows 1,000 at a time?
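
    One way to keep downtime low is to delete in small batches, so each transaction commits quickly and InnoDB never has to hold huge row locks or a giant undo log. Below is a minimal JDBC sketch of that idea, assuming hypothetical column names (user_id, updated_at) on the POSTING table itself and hypothetical connection details; adapt the predicate to the real schema, and add an index covering the predicate columns so each batch can locate its rows without a full scan.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;

        public class BatchedDelete {
            public static void main(String[] args) throws Exception {
                // Hypothetical connection details -- substitute the real host, schema and credentials.
                Class.forName("com.mysql.jdbc.Driver");
                Connection conn = DriverManager.getConnection(
                        "jdbc:mysql://localhost:3306/production", "app_user", "secret");
                try {
                    // Assumed columns on POSTING itself (user_id, updated_at); adapt to the real schema.
                    PreparedStatement ps = conn.prepareStatement(
                            "DELETE FROM POSTING WHERE user_id = ? AND updated_at < ? LIMIT 1000");
                    ps.setInt(1, 5);
                    ps.setInt(2, 100);
                    int deleted;
                    do {
                        deleted = ps.executeUpdate();      // removes at most 1000 rows per call
                        System.out.println("deleted " + deleted + " rows in this batch");
                        Thread.sleep(200);                 // brief pause so other transactions get a turn
                    } while (deleted > 0);
                    ps.close();
                } finally {
                    conn.close();
                }
            }
        }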

    Read the article

  • Oracle WebLogic Server and Oracle Database: A Robust Infrastructure for your Applications

    - by Ruma Sanyal
    It has been said that a chain is only as strong as its weakest link. Well, this is also true for your application infrastructure. Not only are the various components that constitute your infrastructure, like the database and application server, critical; the integration between them [whether coming out of the box from your vendor or done in-house] is paramount. Imagine your database being down and your application server not knowing about it, and as a result your application waiting indefinitely for a database response – not a great situation if high availability is critical to your application. Or one of your database nodes is very busy, but your application server doesn’t have the intelligence to decipher that – it keeps pinging the busy node when it can in fact get a response from another idle node much faster. This is what Oracle WebLogic and Database integration provides: intelligent integration out of the box. Tight integration between Oracle WebLogic and the Oracle Database makes your infrastructure robust enough that not only does each of your infrastructure components provide you with improved RASP [reliability, availability, scalability, and performance], but these components work together to offer improved performance and availability, better resource sharing, inherent scalability, ease of configuration and automated management for your entire infrastructure. Oracle WebLogic Server is the only application server with this degree of integration to Oracle Database. With Oracle WebLogic Server 11g, we introduced Active GridLink for Real Application Clusters (RAC). In conjunction with Oracle Database, this powerful software technology simplifies management, increases availability, and ensures fast connection failover with runtime connection load balancing and affinity capabilities. With the release of Oracle Database 12c this summer, even tighter integration between Oracle WebLogic Server 12c (12.1.2) and Oracle Database 12c has been achieved, and this further optimizes the integration for a global cloud environment. Read about these capabilities in detail in the Oracle WebLogic-Database Integration Whitepaper. Get in-depth ‘how-to’ details from this YouTube video on the topic from our resident expert, Monica Roccelli.

    Read the article

  • What's New in Database Lifecycle Management in Enterprise Manager 12c Release 3

    - by HariSrinivasan
    Enterprise Manager 12c Release 3 includes improvements and enhancements across every area of the product. This blog provides an overview of the new and enhanced features in the Database Lifecycle Management area. I will deep dive into specific features more in depth in subsequent posts. "What's New?"  In this release, we focused on four things: 1. Lifecycle Management Support for new Database12c - Pluggable Databases 2. Management of long running processes, such as a security patch cycle (Change Activity Planner) 3. Management of large number of systems by · Leveraging new framework capabilities for lifecycle operations, such as the new advanced ‘emcli’ script option · Refining features such as configuration search and compliance 4. Minor improvements and quality fixes to existing features · Rollback support for Single instance databases · Improved "OFFLINE" Patching experience · Faster collection of ORACLE_HOME configurations Lifecycle Management Support for new Database 12c - Pluggable Databases Database 12c introduces Pluggable Databases (PDBs), the brand new addition to help you achieve your consolidation goals. Pluggable databases offer unprecedented consolidation at database level and native lifecycle verbs for creating, plugging and unplugging the databases on a container database (CDB). Enterprise Manager can supplement the capabilities of pluggable databases by offering workflows for migrating, provisioning and cloning them using the software library and the deployment procedures. For example, Enterprise Manager can migrate an existing database to a PDB or clone a PDB by storing a versioned copy in the software library. One can also manage the planned downtime related to patching by  migrating the PDBs to a new CDB. While pluggable databases offer these exciting features, it can also pose configuration management and compliance challenges if not managed properly. Enterprise Manager features like inventory management, topology associations and configuration search can mitigate the sprawl of PDBs and also lock them to predefined golden standards using configuration comparison and compliance rules. Learn More ... Management of Long Running datacenter processes - Change Activity Planner (CAP) Currently, customers resort to cumbersome methods to create, execute, track and monitor change activities within their data center. Some customers use traditional tools such as spreadsheets, project planners and in-house custom built solutions. Customers often have weekly sync up meetings across stake holders to collect status and updates. Some of the change activities, for example the quarterly patch set update (PSU) patch rollouts are not single tasks but processes with multiple tasks. Some of those tasks are performed within Enterprise Manager Cloud Control (for example Patch) and some are performed outside of Enterprise Manager Cloud Control. These tasks often run for a longer period of time and involve multiple people or teams. Enterprise Manger Cloud Control supports core data center operations such as configuration management, compliance management, and automation. Enterprise Manager Cloud Control release 12.1.0.3 leverages these capabilities and introduces the Change Activity Planner (CAP). CAP provides the ability to plan, execute, and track change activities in real time. It covers the typical datacenter activities that are spread over a long period of time, across multiple people and multiple targets (even target types). 
Here are some examples of Change Activity Processes in a datacenter: · Patching large environments (PSU/CPU patching cycles) · Upgrading a large number of database environments · Rolling out compliance rules · Database consolidation to Exadata environments. CAP provides user flows for Compliance Officers/Managers (incl. lead administrators) and Operators (DBAs and admins). Managers can create change activity plans for various projects and allocate the resources, targets, and groups affected. Upon activation of the plan, tasks are created and automatically assigned to individual administrators based on target ownership. Administrators (DBAs) can identify their tasks and understand the context, schedules, and priorities. They can complete tasks using Enterprise Manager Cloud Control automation features such as patch plans (or in some cases outside Enterprise Manager). Upon completion, compliance is evaluated for validation and the status of the tasks and the plans is updated. Learn More about CAP ...  Improved Configuration & Compliance Management of a large number of systems Improved Configuration Comparison:  Get to the configuration comparison results faster for simple ad-hoc comparisons. When performing a 1-to-1 comparison, Enterprise Manager will perform the comparison immediately and take the user directly to the results without having to wait for a job to be submitted and executed. Flattened system comparisons reduce comparison setup time and complexity. In addition to the previously existing topological comparison, users now have an option to compare using a “flattened” methodology. Flattening means removing duplicate target instances within the systems and removing the hierarchy of member targets. The result is that it is much easier to spot differences, particularly for specific use cases like comparing patch levels between complex systems like RAC and Fusion Apps. Improved Configuration Search & Advanced EMCLI Script Option for Mass Automation Enterprise Manager 12c introduces a new framework-level capability to script and stitch together multiple tasks using EMCLI. This powerful capability can be leveraged for lifecycle operations, especially when executing a task over a large number of targets. Specific usages of this include retrieving a qualified list of targets using Configuration Search and then using the result set for automation. Another example would be executing a patching operation and then re-executing it on targets where it may have failed. This is complemented by other enhancements, such as better usability for designing reusable configuration searches. In EM 12c Rel 3, a simplified UI makes building ad-hoc searches even easier. Searching for missing patches is a common use of configuration search; this required the use of the advanced options, which are now clearly defined and easy to use. Perform “Configuration Search” using the EMCLI. Users can find and execute Configuration Searches from the EMCLI, which can be extremely useful for building sophisticated automation scripts. For example, run the search named “Oracle Databases on Exadata”, which finds all Database targets running on top of Exadata, then filter the results further by refining on options like name, host, etc.: emcli get_targets -config_search="Databases on Exadata" -target_name="exa%" Use this in powerful mass automation operations using the new emcli script option, for example to solve the use case of finding all databases running on Exadata that host E-Business Suite, and patching them. 
Create a Python script with emcli functions and invoke it in the new EMCLI script option shell. Invoke the script in the new EMCLI with script option directly: $<path to emcli>/emcli @myPSU_Patch.py Richer compliance content:  Now over 50 Oracle Provided Compliance Standards including new standards for Pluggable Database, Fusion Applications, Oracle Identity Manager, Oracle VM and Internet Directory. 9 Oracle provided Real Time Monitoring Standards containing over 900 Compliance Rules across 500 Facets. These new Real time Compliance Standards covers both Exadata Compute nodes and Linux servers. The result is increased Oracle software coverage and faster time to compliance monitoring on Exadata. Enhancements to Patch Management: Overhauled "OFFLINE" Patching experience: Simplified Patch uploads UI to improve the offline experience of patching. There is now a single step process to get the patches into software library. Customers often maintain local repositories of patches, sometimes called software depots, where they host the patches downloaded from My Oracle Support. In the past, you had to move these patches to your desktop then upload them to the Enterprise Manager's Software library through the Enterprise Manager Cloud Control user interface. You can now use the following EMCLI command to upload multiple patches directly from a remote location within the data center: $emcli upload_patches -location <Path to Patch directory> -from_host <HOSTNAME> The upload process filters all of the new patches, automatically selects the relevant metadata files from the location, and uploads the patches to software library. Other Improvements:  Patch rollback for single instance databases, new option in the Patch Plan to rollback the patches added to the patch plans. Upon execution, the procedure would rollback the patch and the SQL applied to the single instance Databases. Improved and faster configuration collection of Oracle Home targets can enable more reliable automation at higher level functions like Provisioning, Patching or Database as a Service. Just to recap, here is a list of database lifecycle management features:  * Red highlights mark – New or Enhanced in the Release 3. • Discovery, inventory tracking and reporting • Database provisioning including o Migration to Pluggable databases o Plugging and unplugging of pluggable databases o Gold image based cloning o Scaling of RAC nodes •Schema and data change management •End-to-end patch management in online and offline modes, including o Patch advisories in online (connected with My Oracle Support) and offline mode o Patch pre-deployment analysis, deployment and rollback (currently only for single instance databases) o Reporting • Upgrade planning and execution of the upgrade process • Configuration management including • Compliance management with out-of-box content • Change Activity Planner for planning, designing and tracking long running processes For more information on Enterprise Manager’s database lifecycle management capabilities, visit http://www.oracle.com/technetwork/oem/lifecycle-mgmt/index.html

    Read the article

  • What parameters to use to compare GUI frameworks / toolkits?

    - by gooli
    I'm doing some research on the best GUI toolkit to use for future products at the company. We're talking about a fairly large organization with quite a bit of code and a complete rewrite project in planning. Don't ask. Anyway, I'm trying to create a list of relevant parameters to judge the toolkits. What would you use to drive the comparison? Here's what I've got so far: maturity; ease of development; ease of prototyping; ease of maintenance; size of hiring pool; available knowledge at the company; training costs; community size; community level of expertise (how hard it is to find good answers to complex problems); amount of expert-level books available; ability to interface with other technologies; deployment considerations; visual aesthetics; ability to access OS resources; multiple monitor support (something that might come in handy in our particular application).

    Read the article

  • Consolidation in a Database Cloud

    - by B R Clouse
    Consolidation of multiple databases onto a shared infrastructure is the next step after Standardization.  The potential consolidation density is a function of the extent to which the infrastructure is shared.  The three models provide increasing degrees of sharing: Server: each database is deployed in a dedicated VM. Hardware is shared, but most of the software infrastructure is not. Standardization is often applied incompletely since operating environments can be moved as-is onto the shared platform. The potential for VM sprawl is an additional downside. Database: multiple database instances are deployed on a shared software / hardware infrastructure. This model is very efficient and easily implemented with the features in the Oracle Database and supporting products. Many customers have moved to this model and achieved significant, measurable benefits. Schema: multiple schemas are deployed within a single database instance. The most efficient model, it places constraints on the environment. Usually this model will be implemented only by customers deploying their own applications.  (Note that a single deployment can combine Database and Schema consolidations.) Customer value: lower costs, better system utilization In this phase of the maturity model, under-utilized hardware can be used to host more workloads, or retired and those workloads migrated to consolidation platforms. Customers benefit from higher utilization of the hardware resources, resulting in reduced data center floor space, and lower power and cooling costs. And, the OpEx savings from Standardization are multiplied, since there are fewer physical components (both hardware and software) to manage. Customer value: higher productivity The OpEx benefits from Standardization are compounded since not only are there fewer types of things to manage, now there are fewer entities to manage. In this phase, customers discover that their IT staff has time to move away from "day-to-day" tasks and start investing in higher value activities. Database users benefit from consolidating onto shared infrastructures by relieving themselves of the requirement to maintain their own dedicated servers. Also, if the shared infrastructure offers capabilities such as High Availability / Disaster Recovery, which are often beyond the budget and skillset of a standalone database environment, then moving to the consolidation platform can provide access to those capabilities, resulting in less downtime. Capabilities / Characteristics In this phase, customers will typically deploy fixed-size clusters and consolidate on a cluster until that cluster is deemed "full," at which point a new cluster is built. Customers will define one or a few cluster architectures that are used wherever possible; occasionally there may be deployments which must be handled as exceptions. The "full" policy may be based on number of databases deployed on the cluster, or observed peak workload, etc. IT will own the provisioning of new databases on a cluster, making the decision of when and where to place new workloads. Resources may be managed dynamically, e.g., as a priority workload increases, it may be given more CPU and memory to handle the spike. Users will be charged at a fixed, relatively coarse level; or in some cases, no charging will be applied. Activities / Tasks Oracle offers several tools to plan a successful consolidation. Real Application Testing (RAT) has a feature to help plan and validate database consolidations. 
Enterprise Manager 12c's Cloud Management Pack for Database includes a planning module. Looking ahead, customers should start planning for the Services phase by defining the Service Catalog that will be made available for database services.

    Read the article

  • 10 Million records = wiped MySQL DB?

    - by Josh K
    So I was trying to load some test data and it appears to have killed my entire database. This is one case where it's great to have backups! They were all plain insert queries, probably about a 900 MB file. What could have gone wrong?

    Read the article

  • A format for storing personal contacts in a database

    - by Gart
    I'm thinking of the best way to store personal contacts in a database for a business application. The traditional and straightforward approach would be to create a table with columns for each element, i.e. Name, Telephone Number, Job title, Address, etc... However, there are known industry standards for this kind of data, like for example vCard, or hCard, or vCard-RDF/XML or even Windows Contacts XML Schema. Utilizing an standard format would offer some benefits, like inter-operablilty with other systems. But how can I decide which method to use? The requirements are mainly to store the data. Search and ordering queries are highly unlikely but possible. The volume of the data is 100,000 records at maximum. My database engine supports native XML columns. I have been thinking to use some XML-based format to store the personal contacts. Then it will be possible to utilize XML indexes on this data, if searching and ordering is needed. Is this a good approach? Which contacts format and schema would you recommend for this? Edited after first answers Here is why I think the straightforward approach is bad. This is due to the nature of this kind of data - it is not that simple. The personal contacts it is not well-structured data, it may be called semi-structured. Each contact may have different data fields, maybe even such fields which I cannot anticipate. In my opinion, each piece of this data should be treated as important information, i.e. no piece of data could be discarded just because there was no relevant column in the database. If we took it further, assuming that no data may be lost, then we could create a big text column named Comment or Description or Other and put there everything which cannot be fitted well into table columns. But then again - the data would lose structure - this might be bad. If we wanted structured data then - according to the database design principles - the data should be decomposed into entities, and relations should be established between the entities. But this adds complexity - there are just too many entities, and lots of design desicions should be made, like "How do we store Address? Personal Name? Phone number? How do we encode home phone numbers and mobile phone numbers? How about other contact info?.." The relations between entities are complex and multiple, and each relation is a table in the database. Each relation needs to be documented in the design papers. That is a lot of work to do. But it is possible to avoid the complexity entirely - just document that the data is stored according to such and such standard schema, period. Then anybody who would be reading that document should easily understand what it was all about. Finally, this is all about using an industry standard. The standard is, hopefully, designed by some clever people who anticipated and described the structure of personal contacts information much better than I ever could. Why should we all reinvent the wheel?? It's much easier to use a standard schema. The problem is, there are just too many standards - it's not easy to decide which one to use!

    Read the article

  • Database call crashes Android Application

    - by Darren Murtagh
    i am using a Android database and its set up but when i call in within an onClickListener and the app crashes the code i am using is mButton.setOnClickListener( new View.OnClickListener() { public void onClick(View view) { s = WorkoutChoice.this.weight.getText().toString(); s2 = WorkoutChoice.this.height.getText().toString(); int w = Integer.parseInt(s); double h = Double.parseDouble(s2); double BMI = (w/h)/h; t.setText(""+BMI); long id = db.insertTitle("001", ""+days, ""+BMI); Cursor c = db.getAllTitles(); if (c.moveToFirst()) { do { DisplayTitle(c); } while (c.moveToNext()); } } }); and the log cat for when i run it is: 04-01 18:21:54.704: E/global(6333): Deprecated Thread methods are not supported. 04-01 18:21:54.704: E/global(6333): java.lang.UnsupportedOperationException 04-01 18:21:54.704: E/global(6333): at java.lang.VMThread.stop(VMThread.java:85) 04-01 18:21:54.704: E/global(6333): at java.lang.Thread.stop(Thread.java:1391) 04-01 18:21:54.704: E/global(6333): at java.lang.Thread.stop(Thread.java:1356) 04-01 18:21:54.704: E/global(6333): at com.b00348312.workout.Splashscreen$1.run(Splashscreen.java:42) 04-01 18:22:09.444: D/dalvikvm(6333): GC_FOR_MALLOC freed 4221 objects / 252640 bytes in 31ms 04-01 18:22:09.474: I/dalvikvm(6333): Total arena pages for JIT: 11 04-01 18:22:09.574: D/dalvikvm(6333): GC_FOR_MALLOC freed 1304 objects / 302920 bytes in 29ms 04-01 18:22:09.744: D/dalvikvm(6333): GC_FOR_MALLOC freed 2480 objects / 290848 bytes in 33ms 04-01 18:22:10.034: D/dalvikvm(6333): GC_FOR_MALLOC freed 6334 objects / 374152 bytes in 36ms 04-01 18:22:14.344: D/AndroidRuntime(6333): Shutting down VM 04-01 18:22:14.344: W/dalvikvm(6333): threadid=1: thread exiting with uncaught exception (group=0x400259f8) 04-01 18:22:14.364: E/AndroidRuntime(6333): FATAL EXCEPTION: main 04-01 18:22:14.364: E/AndroidRuntime(6333): java.lang.IllegalStateException: database not open 04-01 18:22:14.364: E/AndroidRuntime(6333): at android.database.sqlite.SQLiteDatabase.insertWithOnConflict(SQLiteDatabase.java:1567) 04-01 18:22:14.364: E/AndroidRuntime(6333): at android.database.sqlite.SQLiteDatabase.insert(SQLiteDatabase.java:1484) 04-01 18:22:14.364: E/AndroidRuntime(6333): at com.b00348312.workout.DataBaseHelper.insertTitle(DataBaseHelper.java:84) 04-01 18:22:14.364: E/AndroidRuntime(6333): at com.b00348312.workout.WorkoutChoice$3.onClick(WorkoutChoice.java:84) 04-01 18:22:14.364: E/AndroidRuntime(6333): at android.view.View.performClick(View.java:2408) 04-01 18:22:14.364: E/AndroidRuntime(6333): at android.view.View$PerformClick.run(View.java:8817) 04-01 18:22:14.364: E/AndroidRuntime(6333): at android.os.Handler.handleCallback(Handler.java:587) 04-01 18:22:14.364: E/AndroidRuntime(6333): at android.os.Handler.dispatchMessage(Handler.java:92) 04-01 18:22:14.364: E/AndroidRuntime(6333): at android.os.Looper.loop(Looper.java:144) 04-01 18:22:14.364: E/AndroidRuntime(6333): at android.app.ActivityThread.main(ActivityThread.java:4937) 04-01 18:22:14.364: E/AndroidRuntime(6333): at java.lang.reflect.Method.invokeNative(Native Method) 04-01 18:22:14.364: E/AndroidRuntime(6333): at java.lang.reflect.Method.invoke(Method.java:521) 04-01 18:22:14.364: E/AndroidRuntime(6333): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:858) 04-01 18:22:14.364: E/AndroidRuntime(6333): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616) 04-01 18:22:14.364: E/AndroidRuntime(6333): at dalvik.system.NativeStart.main(Native Method) i have notice errors when the application opens but i 
don't know what they are from. when i take out the statements to do with the database there are no errors and everything runs smoothly
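
    The stack trace points at the insert itself: SQLiteDatabase throws "IllegalStateException: database not open" because insertTitle runs before the underlying database has been opened (or after it was closed). Below is a minimal sketch of the fix, reusing the names from the question (db, insertTitle, getAllTitles, DisplayTitle, days, t); the open()/close() calls are an assumption based on the common DBAdapter-style helper -- if the helper is built on SQLiteOpenHelper instead, calling getWritableDatabase() inside insertTitle achieves the same thing.

        mButton.setOnClickListener(new View.OnClickListener() {
            public void onClick(View view) {
                String s = WorkoutChoice.this.weight.getText().toString();
                String s2 = WorkoutChoice.this.height.getText().toString();
                int w = Integer.parseInt(s);
                double h = Double.parseDouble(s2);
                double bmi = (w / h) / h;
                t.setText("" + bmi);

                db.open();                  // assumed helper method: open the SQLiteDatabase before any insert/query
                try {
                    long id = db.insertTitle("001", "" + days, "" + bmi);
                    Cursor c = db.getAllTitles();
                    if (c.moveToFirst()) {
                        do {
                            DisplayTitle(c);
                        } while (c.moveToNext());
                    }
                    c.close();              // always release the cursor
                } finally {
                    db.close();             // and close the database when the work is done
                }
            }
        });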

    Read the article

  • Generating EF Code First model classes from an existing database

    - by Jon Galloway
    Entity Framework Code First is a lightweight way to "turn on" data access for a simple CLR class. As the name implies, the intended use is that you're writing the code first and thinking about the database later. However, I really like the Entity Framework Code First works, and I want to use it in existing projects and projects with pre-existing databases. For example, MVC Music Store comes with a SQL Express database that's pre-loaded with a catalog of music (including genres, artists, and songs), and while it may eventually make sense to load that seed data from a different source, for the MVC 3 release we wanted to keep using the existing database. While I'm not getting the full benefit of Code First - writing code which drives the database schema - I can still benefit from the simplicity of the lightweight code approach. Scott Guthrie blogged about how to use entity framework with an existing database, looking at how you can override the Entity Framework Code First conventions so that it can work with a database which was created following other conventions. That gives you the information you need to create the model classes manually. However, it turns out that with Entity Framework 4 CTP 5, there's a way to generate the model classes from the database schema. Once the grunt work is done, of course, you can go in and modify the model classes as you'd like, but you can save the time and frustration of figuring out things like mapping SQL database types to .NET types. Note that this template requires Entity Framework 4 CTP 5 or later. You can install EF 4 CTP 5 here. Step One: Generate an EF Model from your existing database The code generation system in Entity Framework works from a model. You can add a model to your existing project and delete it when you're done, but I think it's simpler to just spin up a separate project to generate the model classes. When you're done, you can delete the project without affecting your application, or you may choose to keep it around in case you have other database schema updates which require model changes. I chose to add the Model classes to the Models folder of a new MVC 3 application. Right-click the folder and select "Add / New Item..."   Next, select ADO.NET Entity Data Model from the Data Templates list, and name it whatever you want (the name is unimportant).   Next, select "Generate from database." This is important - it's what kicks off the next few steps, which read your database's schema.   Now it's time to point the Entity Data Model Wizard at your existing database. I'll assume you know how to find your database - if not, I covered that a bit in the MVC Music Store tutorial section on Models and Data. Select your database, uncheck the "Save entity connection settings in Web.config" (since we won't be using them within the application), and click Next.   Now you can select the database objects you'd like modeled. I just selected all tables and clicked Finish.   And there's your model. If you want, you can make additional changes here before going on to generate the code.   Step Two: Add the DbContext Generator Like most code generation systems in Visual Studio lately, Entity Framework uses T4 templates which allow for some control over how the code is generated. K Scott Allen wrote a detailed article on T4 Templates and the Entity Framework on MSDN recently, if you'd like to know more. Fortunately for us, there's already a template that does just what we need without any customization. 
Right-click a blank space in the Entity Framework model surface and select "Add Code Generation Item..." Select the Code groupt in the Installed Templates section and pick the ADO.NET DbContext Generator. If you don't see this listed, make sure you've got EF 4 CTP 5 installed and that you're looking at the Code templates group. Note that the DbContext Generator template is similar to the EF POCO template which came out last year, but with "fix up" code (unnecessary in EF Code First) removed.   As soon as you do this, you'll two terrifying Security Warnings - unless you click the "Do not show this message again" checkbox the first time. It will also be displayed (twice) every time you rebuild the project, so I checked the box and no immediate harm befell my computer (fingers crossed!).   Here's the payoff: two templates (filenames ending with .tt) have been added to the project, and they've generated the code I needed.   The "MusicStoreEntities.Context.tt" template built a DbContext class which holds the entity collections, and the "MusicStoreEntities.tt" template build a separate class for each table I selected earlier. We'll customize them in the next step. I recommend copying all the generated .cs files into your application at this point, since accidentally rebuilding the generation project will overwrite your changes if you leave them there. Step Three: Modify and use your POCO entity classes Note: I made a bunch of tweaks to my POCO classes after they were generated. You don't have to do any of this, but I think it's important that you can - they're your classes, and EF Code First respects that. Modify them as you need for your application, or don't. The Context class derives from DbContext, which is what turns on the EF Code First features. It holds a DbSet for each entity. Think of DbSet as a simple List, but with Entity Framework features turned on.   //------------------------------------------------------------------------------ // <auto-generated> // This code was generated from a template. // // Changes to this file may cause incorrect behavior and will be lost if // the code is regenerated. // </auto-generated> //------------------------------------------------------------------------------ namespace EF_CodeFirst_From_Existing_Database.Models { using System; using System.Data.Entity; public partial class Entities : DbContext { public Entities() : base("name=Entities") { } public DbSet<Album> Albums { get; set; } public DbSet<Artist> Artists { get; set; } public DbSet<Cart> Carts { get; set; } public DbSet<Genre> Genres { get; set; } public DbSet<OrderDetail> OrderDetails { get; set; } public DbSet<Order> Orders { get; set; } } } It's a pretty lightweight class as generated, so I just took out the comments, set the namespace, removed the constructor, and formatted it a bit. Done. If I wanted, though, I could have added or removed DbSets, overridden conventions, etc. using System.Data.Entity; namespace MvcMusicStore.Models { public class MusicStoreEntities : DbContext { public DbSet Albums { get; set; } public DbSet Genres { get; set; } public DbSet Artists { get; set; } public DbSet Carts { get; set; } public DbSet Orders { get; set; } public DbSet OrderDetails { get; set; } } } Next, it's time to look at the individual classes. Some of mine were pretty simple - for the Cart class, I just need to remove the header and clean up the namespace. //------------------------------------------------------------------------------ // // This code was generated from a template. 
// // Changes to this file may cause incorrect behavior and will be lost if // the code is regenerated. // //------------------------------------------------------------------------------ namespace EF_CodeFirst_From_Existing_Database.Models { using System; using System.Collections.Generic; public partial class Cart { // Primitive properties public int RecordId { get; set; } public string CartId { get; set; } public int AlbumId { get; set; } public int Count { get; set; } public System.DateTime DateCreated { get; set; } // Navigation properties public virtual Album Album { get; set; } } } I did a bit more customization on the Album class. Here's what was generated: //------------------------------------------------------------------------------ // // This code was generated from a template. // // Changes to this file may cause incorrect behavior and will be lost if // the code is regenerated. // //------------------------------------------------------------------------------ namespace EF_CodeFirst_From_Existing_Database.Models { using System; using System.Collections.Generic; public partial class Album { public Album() { this.Carts = new HashSet(); this.OrderDetails = new HashSet(); } // Primitive properties public int AlbumId { get; set; } public int GenreId { get; set; } public int ArtistId { get; set; } public string Title { get; set; } public decimal Price { get; set; } public string AlbumArtUrl { get; set; } // Navigation properties public virtual Artist Artist { get; set; } public virtual Genre Genre { get; set; } public virtual ICollection Carts { get; set; } public virtual ICollection OrderDetails { get; set; } } } I removed the header, changed the namespace, and removed some of the navigation properties. One nice thing about EF Code First is that you don't have to have a property for each database column or foreign key. In the Music Store sample, for instance, we build the app up using code first and start with just a few columns, adding in fields and navigation properties as the application needs them. EF Code First handles the columsn we've told it about and doesn't complain about the others. Here's the basic class: using System.ComponentModel; using System.ComponentModel.DataAnnotations; using System.Web.Mvc; using System.Collections.Generic; namespace MvcMusicStore.Models { public class Album { public int AlbumId { get; set; } public int GenreId { get; set; } public int ArtistId { get; set; } public string Title { get; set; } public decimal Price { get; set; } public string AlbumArtUrl { get; set; } public virtual Genre Genre { get; set; } public virtual Artist Artist { get; set; } public virtual List OrderDetails { get; set; } } } It's my class, not Entity Framework's, so I'm free to do what I want with it. 
I added a bunch of MVC 3 annotations for scaffolding and validation support, as shown below: using System.ComponentModel; using System.ComponentModel.DataAnnotations; using System.Web.Mvc; using System.Collections.Generic; namespace MvcMusicStore.Models { [Bind(Exclude = "AlbumId")] public class Album { [ScaffoldColumn(false)] public int AlbumId { get; set; } [DisplayName("Genre")] public int GenreId { get; set; } [DisplayName("Artist")] public int ArtistId { get; set; } [Required(ErrorMessage = "An Album Title is required")] [StringLength(160)] public string Title { get; set; } [Required(ErrorMessage = "Price is required")] [Range(0.01, 100.00, ErrorMessage = "Price must be between 0.01 and 100.00")] public decimal Price { get; set; } [DisplayName("Album Art URL")] [StringLength(1024)] public string AlbumArtUrl { get; set; } public virtual Genre Genre { get; set; } public virtual Artist Artist { get; set; } public virtual List<OrderDetail> OrderDetails { get; set; } } } The end result was that I had working EF Code First model code for the finished application. You can follow along through the tutorial to see how I built up to the finished model classes, starting with simple 2-3 property classes and building up to the full working schema. Thanks to Diego Vega (on the Entity Framework team) for pointing me to the DbContext template.

    Read the article

  • NSPredicateEditorRowTemplate for date comparison

    - by Dave DeLong
    I'm building an NSPredicateEditor, and I want the ability to do advanced date comparison. I realize that I can build an NSPredicateEditorRowTemplate with a rightExpressionType of NSDateAttributeType, but the predicates I want to build need to be much more advanced than that. For example, I need to do basic comparisons like: dateKeypath < aDate dateKeypath <= aDate dateKeypath = aDate dateKeypath != aDate dateKeypath > aDate dateKeypath >= aDate These basic comparisons are quite easy to achieve, and I have these working. However, I also need to do comparisons like: dateKeypath isInTheLast n days (or weeks, months, years) dateKeypath isNotInTheLast n days (or weeks, months, years) dateKeypath between aDate and anotherDate How can I achieve these sorts of comparisons? I understand that I'll need to create a custom NSPredicateEditorRowTemplate, but I haven't found any clear documentation on how to achieve something like this. EDIT: Bonus points are available for also knowing how to make these comparisons a full date-time (year-month-day-hour-minute-second) comparison (as NSDateAttributeType only has year-month-day granularity).

    Read the article

  • Error Connecting to Oracle Database from Pentaho Report Designer

    - by knt
    Hi all, I am new to Pentaho, and I am struggling to set up a new database connection. I am trying to connect to an Oracle 10g database, but whenever I test the connection, I get the below error. It doesn't really seem to list any specific error message so I'm not really sure what to do or where to go from this point. I placed ojdbc jar's in my tomcat lib folder, but maybe there is another place those should go. Any help/hints would be greatly appreciated. Error connecting to database [OFF SSP Cert] : org.pentaho.di.core.exception.KettleDatabaseException: Error occured while trying to connect to the database Error connecting to database: (using class oracle.jdbc.driver.OracleDriver) oracle/dms/instrument/ExecutionContextForJDBC org.pentaho.di.core.exception.KettleDatabaseException: Error occured while trying to connect to the database Error connecting to database: (using class oracle.jdbc.driver.OracleDriver) oracle/dms/instrument/ExecutionContextForJDBC org.pentaho.di.core.database.Database.normalConnect(Database.java:366) org.pentaho.di.core.database.Database.connect(Database.java:315) org.pentaho.di.core.database.Database.connect(Database.java:277) org.pentaho.di.core.database.Database.connect(Database.java:267) org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:76) org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2443) org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:510) sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) java.lang.reflect.Method.invoke(Unknown Source) org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:329) org.pentaho.ui.xul.swing.tags.SwingButton$OnClickRunnable.run(SwingButton.java:58) java.awt.event.InvocationEvent.dispatch(Unknown Source) java.awt.EventQueue.dispatchEvent(Unknown Source) java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source) java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source) java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source) java.awt.Dialog$1.run(Unknown Source) java.awt.Dialog$3.run(Unknown Source) java.security.AccessController.doPrivileged(Native Method) java.awt.Dialog.show(Unknown Source) java.awt.Component.show(Unknown Source) java.awt.Component.setVisible(Unknown Source) java.awt.Window.setVisible(Unknown Source) java.awt.Dialog.setVisible(Unknown Source) org.pentaho.ui.xul.swing.tags.SwingDialog.show(SwingDialog.java:234) org.pentaho.reporting.ui.datasources.jdbc.ui.XulDatabaseDialog.open(XulDatabaseDialog.java:237) org.pentaho.reporting.ui.datasources.jdbc.ui.ConnectionPanel$EditDataSourceAction.actionPerformed(ConnectionPanel.java:162) javax.swing.AbstractButton.fireActionPerformed(Unknown Source) javax.swing.AbstractButton$Handler.actionPerformed(Unknown Source) javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source) javax.swing.DefaultButtonModel.setPressed(Unknown Source) javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Source) java.awt.AWTEventMulticaster.mouseReleased(Unknown Source) java.awt.AWTEventMulticaster.mouseReleased(Unknown Source) java.awt.Component.processMouseEvent(Unknown Source) javax.swing.JComponent.processMouseEvent(Unknown Source) java.awt.Component.processEvent(Unknown Source) java.awt.Container.processEvent(Unknown Source) java.awt.Component.dispatchEventImpl(Unknown Source) 
java.awt.Container.dispatchEventImpl(Unknown Source) java.awt.Component.dispatchEvent(Unknown Source) java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source) java.awt.LightweightDispatcher.processMouseEvent(Unknown Source) java.awt.LightweightDispatcher.dispatchEvent(Unknown Source) java.awt.Container.dispatchEventImpl(Unknown Source) java.awt.Window.dispatchEventImpl(Unknown Source) java.awt.Component.dispatchEvent(Unknown Source) java.awt.EventQueue.dispatchEvent(Unknown Source) java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source) java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source) java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source) java.awt.Dialog$1.run(Unknown Source) java.awt.Dialog$3.run(Unknown Source) java.security.AccessController.doPrivileged(Native Method) java.awt.Dialog.show(Unknown Source) java.awt.Component.show(Unknown Source) java.awt.Component.setVisible(Unknown Source) java.awt.Window.setVisible(Unknown Source) java.awt.Dialog.setVisible(Unknown Source) org.pentaho.reporting.ui.datasources.jdbc.ui.JdbcDataSourceDialog.performConfiguration(JdbcDataSourceDialog.java:661) org.pentaho.reporting.ui.datasources.jdbc.JdbcDataSourcePlugin.performEdit(JdbcDataSourcePlugin.java:67) org.pentaho.reporting.designer.core.actions.report.AddDataFactoryAction.actionPerformed(AddDataFactoryAction.java:79)
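
    A quick way to narrow this down is to test the driver and connection details outside Pentaho. The oracle/dms/instrument/ExecutionContextForJDBC class in the error usually indicates that a DMS-instrumented ojdbc jar (for example ojdbc14dms.jar) is being picked up without its companion dms.jar; using the plain ojdbc14/ojdbc5/ojdbc6 jar normally avoids that. Also note that Pentaho Report Designer is a desktop tool, so the driver jar generally belongs in Report Designer's own lib directory rather than Tomcat's (Tomcat only matters for the BI Server). The sketch below uses hypothetical host/SID/credentials:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class OracleConnectionTest {
            public static void main(String[] args) throws Exception {
                // Same driver class Pentaho reports in the error message.
                Class.forName("oracle.jdbc.driver.OracleDriver");
                // Hypothetical thin-driver URL: host, port and SID must match the real 10g instance.
                Connection conn = DriverManager.getConnection(
                        "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");
                try {
                    Statement st = conn.createStatement();
                    ResultSet rs = st.executeQuery("SELECT 1 FROM dual");
                    rs.next();
                    System.out.println("Connected OK, test query returned " + rs.getInt(1));
                    rs.close();
                    st.close();
                } finally {
                    conn.close();
                }
            }
        }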

    Read the article

  • TXT File or Database?

    - by Ruth Rettigo
    Hey folks! What should I use in this case (Apache + PHP)? Database or just a TXT file? My priority #1 is speed. Operations: adding new items, reading items. Max. 1 000 records. Thank you. Database (MySQL):
        +----------+-----+
        | Name     | Age |
        +----------+-----+
        | Joshua   | 32  |
        | Thomas   | 21  |
        | James    | 34  |
        | Daniel   | 12  |
        +----------+-----+
    TXT file:
        Joshua 32
        Thomas 21
        James 34
        Daniel 12

    Read the article

  • Sync LINQ-to-SQL DBML schema with SQL Server database

    - by Maxim Z.
    After creating a SQL Server 2008 database, I made a Linq-to-SQL schema in Visual Studio. Next, in the .dbml visual editor (in Visual Studio), I added PK-to-FK and PK-to-PK associations to the schema. How do I copy those associations that I created in Visual Studio over to the database? In other words, how do I sync with the DB?

    Read the article

  • Database Replication

    - by tanthiamhuat
    I have tried to follow the steps outlined in http://www.howtoforge.com/mysql_database_replication for database replication. I have created the database exampledb, created some tables and loaded them with values. But when I execute USE exampledb; FLUSH TABLES WITH READ LOCK; SHOW MASTER STATUS; I do not get any output; it says 0 rows affected. Why is it so?

    Read the article

  • SQL Server high CPU and I/O activity database tuning

    - by zapping
    Our application has been running very slowly recently. On debugging and tracing, we found that the process shows high CPU usage and SQL Server shows high I/O activity. Can you please offer guidance on how this can be optimised? The application is now about a year old and the database file sizes are not very big. The database is set to auto-shrink. It's running on Windows 2003 and SQL Server 2005, and the application is a web application coded in C# (Visual Studio 2005).
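
    Before changing anything, it helps to see where the I/O is actually going; SQL Server 2005's dynamic management views can rank cached statements by logical reads. Below is a hedged sketch (hypothetical connection string; requires the Microsoft JDBC driver or an equivalent on the classpath, and VIEW SERVER STATE permission). It is also worth noting that the auto-shrink setting mentioned above is itself a frequent cause of extra I/O and index fragmentation, so turning it off and maintaining indexes is a common first step.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class TopIoQueries {
            public static void main(String[] args) throws Exception {
                // Hypothetical connection details for the Microsoft JDBC driver.
                Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
                Connection conn = DriverManager.getConnection(
                        "jdbc:sqlserver://dbhost:1433;databaseName=master", "sa", "secret");
                // Rank cached statements by logical reads (sys.dm_exec_query_stats exists from SQL Server 2005).
                String sql =
                    "SELECT TOP 10 qs.total_logical_reads, qs.total_worker_time, qs.execution_count, st.text " +
                    "FROM sys.dm_exec_query_stats qs " +
                    "CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st " +
                    "ORDER BY qs.total_logical_reads DESC";
                try {
                    Statement stmt = conn.createStatement();
                    ResultSet rs = stmt.executeQuery(sql);
                    while (rs.next()) {
                        System.out.printf("reads=%d cpu=%d execs=%d%n  %s%n",
                                rs.getLong(1), rs.getLong(2), rs.getLong(3), rs.getString(4));
                    }
                    rs.close();
                    stmt.close();
                } finally {
                    conn.close();
                }
            }
        }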

    Read the article

  • How to configure database for exe.file

    - by newbie123
    Recently I created a Java desktop application that I wish to share with my friends. I was thinking of converting it into an exe file using exe4j, but as for the database part I am not sure how to configure it so that the application can run on my friends' desktops without having to reconfigure the database. I am using Microsoft Access; can anyone guide me on how to do it?
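
    One common approach, sketched below, is to ship the Access file alongside the application and build the JDBC URL from a relative path at runtime, so nothing has to be configured on the other machine. The file name and the choice of the UCanAccess driver (a pure-Java JDBC driver for Access files, bundled with the app together with its dependency jars) are assumptions for illustration; an ODBC-based setup would need a data source configured on each desktop, which is exactly what this avoids.

        import java.io.File;
        import java.sql.Connection;
        import java.sql.DriverManager;

        public class PortableDbConfig {
            public static Connection openDatabase() throws Exception {
                // Assumed layout: contacts.accdb sits in a "data" folder next to the application jar.
                File dbFile = new File(System.getProperty("user.dir"), "data/contacts.accdb");
                // UCanAccess reads the .mdb/.accdb file directly, so no driver or DSN
                // has to be installed or configured on the friend's machine.
                Class.forName("net.ucanaccess.jdbc.UcanaccessDriver");
                String url = "jdbc:ucanaccess://" + dbFile.getAbsolutePath();
                return DriverManager.getConnection(url);
            }

            public static void main(String[] args) throws Exception {
                Connection conn = openDatabase();
                System.out.println("Opened " + conn.getMetaData().getURL());
                conn.close();
            }
        }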

    Read the article

  • How to create item templates for SQL files, for Visual Studio Database Projects

    - by jonathanconway
    It's possible to define your own custom templates for normal project types, such as templates for C# files, for a Class Library project. What about a 'Database Project' scenario? I would like to define a standard template for adding a stored procedure, which uses the company's conventions for all stored procedures, such as standard comments at the top. Which folder contains Visual Studio templates for Database projects?

    Read the article

  • Database Schema Managing Framework/Library

    - by Karol Kolenda
    I'm looking for a database framework/library for .NET which will act as a unified layer between an application and databases. Please note that I'm not interested in querying/updating data (there are plenty of DALs for that) but rather in a framework which allows me to manage table schemas and indexes in a managed fashion (without using database-specific SQL). I'm particularly interested in a library which supports Oracle, SQL Server and PostgreSQL.

    Read the article

< Previous Page | 48 49 50 51 52 53 54 55 56 57 58 59  | Next Page >