Search Results

Search found 41147 results on 1646 pages for 'database security'.


  • Advantage Database Replication

    - by Jon
    I have a client who wants two sites to be able to sync databases, so that information at Site A can be synced with Site B and both sites can look at the same data. I'm not even sure what infrastructure is required. Would a VPN be needed to connect the two databases, or would an Internet-hosted database work (i.e., Site A connects to InternetDatabase and Site B connects to InternetDatabase, each site copies data to it periodically, the InternetDatabase syncs it, and the sites can then pull data down)? My other thought was something like Dropbox: if Site A and Site B use a Dropbox account to sync the ADT files and so on, can the database at each site then sync with those ADT files? Thanks

    Read the article

  • Store data in Ruby on Rails without a database

    - by snowmaninthesun
    I have a few data values that I need to store in my Rails app, and I wanted to know if there are any alternatives to creating a database table just for this simple task. Background: I'm writing some analytics and dashboard tools for my Ruby on Rails app, and I'm hoping to speed up the dashboard by caching results that will never change. Right now I pull all users for the last 30 days and rearrange them so I can see the number of new users per day. It works great but takes quite a long time; in reality I should only need to calculate the most recent day and store the rest of the array somewhere else. What is the best place to store this array? Creating a database table seems a bit of overkill, and I'm not sure that global variables are the correct answer. Is there a best practice for persisting data like this? If anyone has done anything like this before, let me know what you did and how it turned out.

    Read the article

  • How to store a 250 MB database in an Offline Web App

    - by Couto
    OK, maybe I'm not seeing the whole picture or something, but I kind of need a brainstorm. The purpose is to make a web app (HTML5, CSS, JavaScript) that has to search a 250 MB database without any Internet connection, so yes, the database has to be on the client side. The hard part: this app has to work on an iPod or iPhone without an Internet connection (an initial connection to download the app is OK). LocalStorage has a 5 MB limit; CouchDB would be great, since it is easily accessed from JavaScript (privacy concerns don't matter at this point). So I'm pretty much out of ideas... Does anyone see an alternative, or a solution for this purpose?

    Read the article

  • Selecting pictures from database

    - by user1691618
    Okay guys, I need some help. I'm trying to design a page where a user can ask a question and upload up to 4 pictures. I have two database tables, questions and image_table. I have everything uploading correctly, so that's not my problem. What I am having trouble with is selecting, from the image_table table, only the images that correspond to a given question. I can't figure out how to do this. Any help would be greatly appreciated.
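
    The usual design is a foreign-key column on image_table that stores the id of the question each picture belongs to, so the lookup becomes a single filtered SELECT. A minimal sketch in Python's sqlite3 (the column names question_id and filename are hypothetical):

        import sqlite3

        conn = sqlite3.connect("site.db")  # stand-in for the real database

        def images_for_question(question_id):
            # Only rows whose foreign key matches this question come back.
            cur = conn.execute(
                "SELECT filename FROM image_table WHERE question_id = ?",
                (question_id,),
            )
            return [row[0] for row in cur.fetchall()]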

    Read the article

  • Iterating over a database column in Django

    - by curious
    I would like to iterate a calculation over a column of values in a MySQL database, and wondered if Django has any built-in functionality for doing this. Previously, I have just used the following to store each column as a list of tuples named table_column:

        import MySQLdb
        import sys

        try:
            conn = MySQLdb.connect(host="localhost", user="user",
                                   passwd="passwd", db="db")
        except MySQLdb.Error, e:
            print "Error %d: %s" % (e.args[0], e.args[1])
            sys.exit(1)

        cursor = conn.cursor()
        for table in ['foo', 'bar']:
            for column in ['foobar1', 'foobar2']:
                cursor.execute('select %s from %s' % (column, table))
                exec "%s_%s = cursor.fetchall()" % (table, column)
        cursor.close()
        conn.commit()
        conn.close()

    Is there any functionality built into Django to more conveniently iterate through the values of a column in a database table? I'm dealing with millions of rows, so speed of execution is important.
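
    For the Django part of the question: a QuerySet can stream a single column without the exec trick. A sketch assuming a model named Foo with a field foobar1 (hypothetical names echoing the snippet above):

        from myapp.models import Foo  # hypothetical app and model names

        total = 0
        # values_list(..., flat=True) yields bare column values;
        # iterator() streams rows instead of caching the whole result set.
        for value in Foo.objects.values_list('foobar1', flat=True).iterator():
            total += value  # stand-in for the real per-row calculation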

    Read the article

  • Comparing textbox value to database

    - by simon
    Hi! I would like to compare values from a textbox with data from a table. I tried the code below, but I got an error saying the input string was not in a correct format.

        string connectionString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=save.mdb";
        try
        {
            database = new OleDbConnection(connectionString);
            database.Open();
            string queryString = "SELECT zivila.naziv, users.user_name FROM (obroki_save "
                + " LEFT JOIN zivila ON zivila.ID = obroki_save.ID_zivila) "
                + " LEFT JOIN users ON users.ID = obroki_save.ID_uporabnika "
                + " WHERE users.ID = '" + Convert.ToInt16(id.iDTextBox.Text) + "'";
            loadDataGrid(queryString);
        }
        catch (Exception ex)
        {
            MessageBox.Show(ex.Message);
            return;
        }
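
    The "wrong format" error most likely comes from Convert.ToInt16 choking on empty or non-numeric textbox input, not from the database itself; validating the input first and binding it as a parameter avoids both the crash and SQL injection. The pattern, sketched in Python's sqlite3 (the C# equivalents are Int16.TryParse and OleDb parameters):

        import sqlite3

        conn = sqlite3.connect("save.db")  # stand-in for the Access file

        def load_meals(raw_text):
            try:
                user_id = int(raw_text)   # rejects "" and "abc" up front
            except ValueError:
                return []
            cur = conn.execute(
                "SELECT zivila.naziv, users.user_name "
                "FROM obroki_save "
                "LEFT JOIN zivila ON zivila.ID = obroki_save.ID_zivila "
                "LEFT JOIN users ON users.ID = obroki_save.ID_uporabnika "
                "WHERE users.ID = ?",     # bound, not concatenated
                (user_id,),
            )
            return cur.fetchall()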

    Read the article

  • Database design to hold multiple iteration measurements

    - by Valder
    Hi all. I am new to SQLite and SQL in general, and I am keen to switch from flat files to SQLite for holding some measurement information. I need a tip on how to better lay out the database, since I have zero experience with this. I have ~10,000 unique statistic counters that are collected before and after each test iteration. The maximum number of iterations is 10, though there may be fewer. I was thinking the following:

        CREATE TABLE stat_names(stat_id, stat_name);
        CREATE TABLE stats_per_iteration(stat_id, before_iter_1, after_iter_1,
                                         before_iter_2, after_iter_2, ...);

    The stat_names table would map a full counter name to a unique stat_id. The stats_per_iteration table would hold the measurement data in 1 + 10 * 2 columns, joined on stat_names.stat_id = stats_per_iteration.stat_id. Or maybe I should have a separate table for each iteration, which would result in 1 + 10 tables in the database? Thanks!
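
    A more conventional layout than 21 columns (or 10 tables) is one row per stat per iteration, which keeps the schema fixed however many iterations actually run. A sketch of that design (the column names are suggestions, not prescriptions):

        import sqlite3

        conn = sqlite3.connect("measurements.db")
        conn.executescript("""
            CREATE TABLE stat_names (
                stat_id   INTEGER PRIMARY KEY,
                stat_name TEXT UNIQUE
            );
            -- One row per stat per iteration; missing iterations are
            -- simply absent rows rather than NULL-filled columns.
            CREATE TABLE measurements (
                stat_id      INTEGER REFERENCES stat_names(stat_id),
                iteration    INTEGER,   -- 1..10, fewer is fine
                before_value REAL,
                after_value  REAL,
                PRIMARY KEY (stat_id, iteration)
            );
        """)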

    Read the article

  • Problems inserting data into a database

    - by jannes braet
        $message = $_POST['answer'];
        $message = nl2br($message);
        $message = strip_tags($message, '<p><a><b><i><strong><em><code><sub><sup><img><ul><ol><li>');
        $message = mysql_real_escape_string($message);
        $user = $_SESSION['SESS_MEMBER_ID'];
        $qry = "INSERT INTO forum_rules (message,author,date) VALUES ($message,$user,'" . date("Y-m-d H:i:s") . "')";
        $result = mysql_query($qry) or die(mysql_error());
        if (!$result) {
            echo "error inserting data into database";
        } else {
            ...
        }

    This code always outputs "error inserting data into database", and I don't see what I'm doing wrong. I have also tried it without the date part, but that didn't work either. Can someone please tell me what I'm doing wrong here?
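
    The likely failure is that $message and $user are string values but appear unquoted in the VALUES list; quoting them ('$message') would fix the syntax, and binding them as parameters removes the quoting problem entirely. The parameterized shape, sketched in Python's MySQLdb (in PHP, mysqli or PDO prepared statements do the same):

        import MySQLdb
        from datetime import datetime

        conn = MySQLdb.connect(host="localhost", user="user",
                               passwd="passwd", db="forum")  # placeholders
        message = "<p>hello</p>"   # stand-in for the sanitized POST value
        user_id = 1                # stand-in for the session member id

        cur = conn.cursor()
        # The driver quotes and escapes each %s value itself, so there is
        # no manual escaping and no missing-quote syntax error.
        cur.execute(
            "INSERT INTO forum_rules (message, author, date) "
            "VALUES (%s, %s, %s)",
            (message, user_id, datetime.now().strftime("%Y-%m-%d %H:%M:%S")),
        )
        conn.commit()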

    Read the article

  • Suggestions on free database products to include in an application - SQL Server Express or

    - by superartsy
    I am working on an enterprise-level product that is designed around SQL Server Express and specifically its features (views, concurrent users, stored procedures, CASE and IF statements). Though we don't use any advanced SQL Server features, the 4 GB database size limit in the Express edition may end up being a limitation. A work-around is that customers can move to more full-featured versions of SQL Server. The problem is that SQL Server Express deployment is not easy, and the installer is huge, which is a major drawback for someone looking to try our product; you don't want end users to pass on a product because the download is enormous. Does anyone have a recommendation for a database that has a smaller footprint but all the features of Express, and which can be migrated to Express later?

    Read the article

  • What is the most "database-independent" way of creating a variable-length text field in a database

    - by Thibaut Colar
    I want to create a text field in the database with no specific size (it will store text of unknown length in some cases); the particular texts are serialized simple objects (~ JSON). What is the most database-independent way to do this:

    - a varchar with no size specified (I don't think all databases support this)
    - a 'text' field - this seems to be common, but I don't believe it's a standard
    - a blob or other object of that kind?
    - a varchar of a very large size (that's inefficient and probably wastes disk space)
    - other?

    I'm using JDBC, but I'd like to use something that is supported in most databases (Oracle, MySQL, PostgreSQL, Derby, HSQL, H2, etc.). Thanks.

    Read the article

  • Ruby on Rails: Best way to save search queries in a database

    - by Adam Templeton
    For a RoR app I'm helping develop, I need to save all search queries in a database so I can analyze them later. My plan right now is to create a Result model and table and save each search query's text in that table, along with the user's ID, the time, etc. However, the app has about 15,000 users, so I'm afraid the single-table approach won't be very efficient when it comes time to parse that data. (The database is set up via MySQL, if that factors in at all.) Am I just being paranoid? Is there a Ruby gem that handles this sort of thing, or a better approach I could take? Any input would be appreciated.

    Read the article

  • Array of an array (Database)

    - by Anne Mah Li'en
    I am trying to print out an array of an array from the database. Below is my code. I am able to retrieve all the values from the first array, but an error occurs when I try to retrieve the second array from the database.

        <%
        ArrayList<Questionnaire> allCategories = QuestionnaireController.getQuestionnaireByCategoryAll();
        for (int i = 0; i < allCategories.size(); i++) {
            Questionnaire allCategoriesQuestionnaire = allCategories.get(i);
            out.println("<div class=\"silverheader\">"
                + "<a href=\"\">"
                + allCategoriesQuestionnaire.getCategory()
                + "</a>"
                + "</div>"
                + "<div class=\"submenu\">"
                + "ArrayList<Questionnaire> CategoriesSustainability = QuestionnaireController.getQuestionnaireByCategorySustainability();"
                + out.println(CategoriesSustainability.get(0).getCategory());
                + "<br />"
                + "</div>");
        }
        %>

    Read the article

  • Proper code but can't insert to database

    - by Dchris
    I have a Visual Basic project using an Access database. I run the query below, but I don't see any new data in my database table. There is no exception or error; instead, the success message box is shown.

        Dim ID As Integer = 2
        Dim TableNumber As Integer = 2
        Dim OrderDate As Date = Format(Now, "General Date")
        Dim TotalPrice As Double = 100.0
        Dim ConnectionString As String = "myconnectionstring"
        Dim con As New OleDb.OleDbConnection(ConnectionString)
        Try
            Dim InsertCMD As OleDb.OleDbCommand
            InsertCMD = New OleDb.OleDbCommand("INSERT INTO Orders([ID],[TableNumber],[OrderDate],[TotalPrice]) VALUES(@ID,@TableNumber,@OrderDate,@TotalPrice);", con)
            InsertCMD.Parameters.AddWithValue("@ID", ID)
            InsertCMD.Parameters.AddWithValue("@TableNumber", TableNumber)
            InsertCMD.Parameters.AddWithValue("@OrderDate", OrderDate)
            InsertCMD.Parameters.AddWithValue("@TotalPrice", TotalPrice)
            con.Open()
            InsertCMD.ExecuteNonQuery()
            MessageBox.Show("Successfully Added New Order", "Success", MessageBoxButtons.OK, MessageBoxIcon.Information)
            con.Close()
        Catch ex As Exception
            ' Something went wrong
            MessageBox.Show(ex.ToString)
        Finally
            ' Success or not, make sure the connection is closed
            If con.State <> ConnectionState.Closed Then con.Close()
        End Try

    What is the problem?

    Read the article

  • Sports league database design

    - by John
    Hello, I'm developing a database to store statistics for a sports league, and I'd like to show several tables:

    - a league table that indicates the position of each team in the current and previous fixture
    - a table that shows the position of a team in every fixture of the championship

    I have a matches table:

        Matches(IdMatch, IdTeam1, IdTeam2, GoalsTeam1, GoalsTeam2)

    With this table I can calculate the total points of every team based on the matches the team has played, but every time I want to show the league table I have to recalculate the points. I also have a problem calculating the position a team held in each of the last 10 fixtures, because I have to run 10 queries. Storing the league table for every fixture in a database table is another approach, but every time I change a match already played, I have to recalculate every fixture from there on... Is there a better approach to this problem? Thanks
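
    One way out is to never store points at all and compute the standings with a single aggregate query, so an edited match can never leave the table stale. A sketch over the Matches table, runnable against SQLite from Python (a 3-1-0 scoring scheme is assumed):

        import sqlite3

        conn = sqlite3.connect("league.db")
        # The UNION ALL turns each match into one row per participating
        # team; the outer query then totals the points per team.
        standings = conn.execute("""
            SELECT team, SUM(points) AS points FROM (
                SELECT IdTeam1 AS team,
                       CASE WHEN GoalsTeam1 > GoalsTeam2 THEN 3
                            WHEN GoalsTeam1 = GoalsTeam2 THEN 1
                            ELSE 0 END AS points
                  FROM Matches
                UNION ALL
                SELECT IdTeam2,
                       CASE WHEN GoalsTeam2 > GoalsTeam1 THEN 3
                            WHEN GoalsTeam1 = GoalsTeam2 THEN 1
                            ELSE 0 END
                  FROM Matches
            )
            GROUP BY team
            ORDER BY points DESC
        """).fetchall()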

    Read the article

  • Connecting to the database in a Created Library - CodeIgniter

    - by Brogrammer
    I have a little issue right now and it's bugging me; hopefully one of you will be able to help me out. Basically, I'm creating a library to use in CodeIgniter and I'm getting this error:

        A PHP Error was encountered
        Severity: Notice
        Message: Undefined property: Functions::$db
        Filename: libraries/Functions.php
        Line Number: 11

    The Database library is already on autoload, as is my Functions library:

        $autoload['libraries'] = array('database','session','encrypt','functions');

    The Functions.php file is located in the application/libraries folder, as expected. Line 11 consists of this:

        $this->db->where('username', $data);

    I'm not sure why db is an undefined property.

    Read the article

  • Ubuntu server security; Is this enough?

    - by Camran
    I have a classifieds website which uses PHP 5 and MySQL, and also Java (Solr). I am new to Linux and VPSs... I have set up SSH, I have installed iptables, and I use PuTTY as a terminal. Filezilla is also installed on my computer, and whenever I connect to my VPS, the "host" field in Filezilla says "sftp://ip-address", so I am guessing it is a safe connection. I used the command whereis sshd to find out whether I had sshd installed, and it returned the places where it is installed, so it was already there; I didn't install it myself. Now, my question is: is this enough? What other security measures should I take? Any good articles about security and how to set it up on a VPS? Remember, I have a Windows XP OS on my laptop, but the OS for my VPS is Ubuntu 9.10. Also, I have Apache 2 installed... Thanks

    Read the article

  • Database that accesses a website?

    - by Alec
    Hi guys, what application should I use to automatically access a website and gather information? Basically, I have a database that completes calculations for me; however, I currently have to gather the parameters from a website manually and input them into my database. What I would like is an application that will take my input (say, the name of a product), access the website, enter this name into a search box, run the search, and then extract the desired information from the results page, returning it to my application to complete the calculation and present me with the result. This is a little out of my depth, but I'm willing to learn, no matter how complicated the software. Cheers for your help.
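
    What is being described is screen scraping. In Python, the requests and BeautifulSoup libraries cover the fetch-and-extract loop; the URL, query parameter and CSS selector below are entirely hypothetical stand-ins for the real site:

        import requests
        from bs4 import BeautifulSoup

        def lookup_product(name):
            # Hypothetical site: the search form submits a 'q' GET parameter.
            resp = requests.get("https://example.com/search",
                                params={"q": name}, timeout=10)
            resp.raise_for_status()
            soup = BeautifulSoup(resp.text, "html.parser")
            # Hypothetical markup: the value sits in an element with
            # class 'price' on the results page.
            cell = soup.select_one(".price")
            return float(cell.get_text(strip=True)) if cell else None

        print(lookup_product("widget"))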

    Read the article

  • Preventing SQL injection in a database class

    - by Josh
    I'm building a database class and thought it'd be a good idea to incorporate some form of SQL injection prevention (duh!). Here's the method that runs a database query:

        class DB
        {
            var $db_host   = 'localhost';
            var $db_user   = 'root';
            var $db_passwd = '';
            var $db_name   = 'whatever';

            function query($sql)
            {
                $this->result = mysql_query($sql, $this->link);
                if (!$this->result) {
                    $this->error(mysql_error());
                } else {
                    return $this->result;
                }
            }
        }

    There's more in the class than that, but I'm cutting it down just for this. The problem I'm facing is that if I just use mysql_real_escape_string($sql, $this->link); then it escapes the entire query and leads to a SQL syntax error. How can I dynamically find the variables that need to be escaped? I want to avoid calling mysql_real_escape_string() in my main code blocks; I'd rather have it in a function. Thanks.
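
    Escaping an already-assembled query can't work, because by that point the escaper cannot tell data apart from SQL keywords. The standard fix is to keep them separate and let the driver escape each value: query() takes the SQL with placeholders plus a tuple of parameters. A sketch of that shape in Python (PHP's counterpart is mysqli or PDO prepared statements):

        import MySQLdb

        class DB:
            def __init__(self):
                self.link = MySQLdb.connect(host="localhost", user="root",
                                            passwd="", db="whatever")

            def query(self, sql, params=()):
                # The %s placeholders are filled in by the driver, which
                # escapes each value; the SQL text itself is never touched.
                cur = self.link.cursor()
                cur.execute(sql, params)
                return cur.fetchall()

        db = DB()
        rows = db.query("SELECT * FROM users WHERE name = %s", ("o'brien",))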

    Read the article

  • Oracle IRM video demonstration of separating duties of document security

    - by Simon Thorpe
    One thing an Information Rights Management technology should do well is separate out three main areas of responsibility:

    - The business process of defining and controlling the classifications to which content is secured, and the definition of the roles that employees, customers, partners and contractors have when accessing secured content.
    - Allowing IT to manage the server and authorize the creation of new classifications to meet business needs, while playing no role in the ongoing management of a classification once it has been created and handed off to the business.
    - Empowering the business to take ownership of the classifications to which its own content is secured. For example, an employee leading an acquisition project should be responsible for defining who has access to confidential project documents. This person should be able to manage the rights users have in the classification and also be the point of contact for those wishing to gain rights.

    Oracle IRM has had this core model at its heart since its creation in the late 1990s. Due in part to the important separation of rights from the documents themselves, Oracle IRM places the right functionality within the right parts of the business. For example, some IRM technologies allow the end user to make decisions about whether users can print, edit or save a secured document; in practice this results in a wide variety of content secured with a plethora of options that don't conform to any policy. With Oracle IRM, users choose from a list of classifications against which they have been given the ability to secure information. Their role in the classification was given to them by the business owner of the classification, yet the definition of the role resides within the realm of corporate security, who own the overall business classification policies. It is this type of design and philosophy that makes Oracle IRM an enterprise solution, one that works beyond a few users and a few secured documents, scaling to hundreds of thousands of users and millions of documents. The following video shows how Oracle IRM 11g, the market-leading document security solution, lets the security organization create and manage classifications while the business owns and manages them. If you want to experience using Oracle IRM-secured content and the effects of the different roles users have, why not sign up for our free demonstration.

    Read the article

  • Database Insider Newsletter Helps Oracle Achieve Maggie Award Bid

    - by jenny.gelhausen
    The Database Insider team is honored that our monthly newsletter helped Oracle be nominated as a 2010 Maggie Award finalist. The Maggie Awards, known as the "Oscars" of the periodicals world, recognize excellence among individuals and companies whose work is deemed "The Best in the West" in a wide variety of publishing categories. The list of 2010 Maggie Award finalists is impressive and includes some past champions, so win or lose, the Database Insider team is thrilled to have helped Oracle achieve this finalist nomination in the category of Best Web E-Newsletter/Trade & Consumer. Thanks to all our faithful readers and subscribers. Haven't seen our newsletter yet? Read the latest Database Insider Newsletter edition. We invite you to subscribe and join others receiving the Oracle Database Insider Newsletter in their inbox; click here to register and start receiving your monthly newsletter. Under Oracle Communications, check the box next to: Oracle Database Insider - all about Oracle Database features and options, including news and analysis, reviews, customer stories, events, offers, and more. Monthly. See sample.

    Read the article

  • SQL Azure Security: DoS

    - by Herve Roggero
    Since I decided to understand in more depth how SQL Azure works, I started to dig into its performance characteristics, and I wrote an application that allows me to put SQL Azure to the test and compare results with a local SQL Server database. One of the options I added is the ability to issue the same command on multiple threads to get certain performance metrics. That's when I stumbled on an interesting security feature of SQL Azure: its denial-of-service (DoS) detection engine. What this security feature does is check the rate at which connections are being established, and if the rate is too high, SQL Azure blocks all communication from that machine. I am still trying to learn more about this specific feature, but it appears that going to the SQL Azure portal and testing the connection from the portal "resets" the feature, and you are allowed to connect again... until you reach the login threshold. In the specific test I was performing, all the logins were successful. I haven't tried to log in with an invalid account or password... that will be for next time. On my LinkedIn group (SQL Server and SQL Azure Security: http://www.linkedin.com/groups?gid=2569994&trk=hb_side_g), Chip Andrews (www.sqlsecurity.com) pointed out that this feature could in itself present an internal threat: in theory, a rogue application could issue many login requests from a NATed network, which could potentially prevent any production system on the same network from connecting to SQL Azure. My initial response was that this could indeed be the case. However, while the TCP protocol carries only the latest NATed IP address of a machine (which masks the origin of the machine making the SQL request), the TDS protocol itself contains the IP address of the machine making the initial request, so technically there would be a way for SQL Azure to block only the internal IP address making the rogue requests. This warrants further investigation... stay tuned...

    Read the article

  • Issue 55 - Skin Object Tokens, Optimized Control Panel, OWS Validation and Security, RAD

    April 2010 - Welcome to Issue 55 of DNN Creative Magazine. In this issue we focus on the new Skin Object token method introduced in DotNetNuke 5 for adding tokens into a DotNetNuke skin. A Skin Object token is a web user control which covers skin elements such as the logo, menu, search, login links, date, copyright, languages, links, banners, privacy, terms of use, etc. Following this, we demonstrate how to install and use two advanced DotNetNuke admin control panels which are available for free from Oliver Hine. These control panels provide an optimized version of the admin control panel to improve performance and page load times, as well as a ribbon-bar control panel which adds additional features. Next, we continue the Open Web Studio tutorials; this month we demonstrate some very advanced techniques for building a car parts application in Open Web Studio. Throughout the tutorial we cover form input, validation, how to use dependent drop-down lists, populating checkbox lists, and a new concept of data-level security, which allows you to control which data a user can access within a module. To finish, we have part five of the "How to Build a News Application with DotNetMushroom Rapid Application Developer (RAD)" article, where we demonstrate how to implement paging. This issue comes complete with 14 videos:

    Skinning: Skin Object Tokens for DotNetNuke 5 (8 videos - 64 mins)
    Free Module: Advanced Optimized Control Panel by Oliver Hine (1 video - 11 mins)
    Module Development Series: Form Validation, Dependant Drop Downs and Data Level Security in OWS (5 videos - 44 mins)
    How to Implement Paging with DotNetMushroom RAD

    View issue 55 to download all of the videos in one zip file. DNN Creative Magazine for DotNetNuke web designers covers DotNetNuke module video reviews, video tutorials, MP3 interviews, resources and web design tips for working with DotNetNuke. In 55 issues we have created 563 videos!

    Read the article

  • What's New in Database Lifecycle Management in Enterprise Manager 12c Release 3

    - by HariSrinivasan
    Enterprise Manager 12c Release 3 includes improvements and enhancements across every area of the product. This blog provides an overview of the new and enhanced features in the Database Lifecycle Management area; I will dive into specific features in more depth in subsequent posts.

    What's New? In this release, we focused on four things:

    1. Lifecycle management support for the new Database 12c - Pluggable Databases
    2. Management of long-running processes, such as a security patch cycle (Change Activity Planner)
    3. Management of large numbers of systems, by leveraging new framework capabilities for lifecycle operations (such as the new advanced 'emcli' script option) and by refining features such as configuration search and compliance
    4. Minor improvements and quality fixes to existing features: rollback support for single-instance databases, an improved "offline" patching experience, and faster collection of ORACLE_HOME configurations

    Lifecycle Management Support for the new Database 12c - Pluggable Databases

    Database 12c introduces Pluggable Databases (PDBs), the brand-new addition that helps you achieve your consolidation goals. Pluggable databases offer unprecedented consolidation at the database level and native lifecycle verbs for creating, plugging and unplugging databases on a container database (CDB). Enterprise Manager can supplement the capabilities of pluggable databases by offering workflows for migrating, provisioning and cloning them using the software library and the deployment procedures. For example, Enterprise Manager can migrate an existing database to a PDB, or clone a PDB by storing a versioned copy in the software library. One can also manage the planned downtime related to patching by migrating the PDBs to a new CDB. While pluggable databases offer these exciting features, they can also pose configuration management and compliance challenges if not managed properly. Enterprise Manager features like inventory management, topology associations and configuration search can mitigate PDB sprawl and also lock PDBs to predefined golden standards using configuration comparison and compliance rules. Learn More ...

    Management of Long-Running Datacenter Processes - Change Activity Planner (CAP)

    Currently, customers resort to cumbersome methods to create, execute, track and monitor change activities within their data center: traditional tools such as spreadsheets, project planners and in-house custom-built solutions, often accompanied by weekly sync-up meetings across stakeholders to collect status updates. Some change activities, for example the quarterly patch set update (PSU) rollouts, are not single tasks but processes with multiple tasks. Some of those tasks are performed within Enterprise Manager Cloud Control (for example, patching) and some are performed outside of it. These tasks often run for a long period of time and involve multiple people or teams. Enterprise Manager Cloud Control supports core data center operations such as configuration management, compliance management, and automation. Release 12.1.0.3 leverages these capabilities and introduces the Change Activity Planner (CAP). CAP provides the ability to plan, execute, and track change activities in real time. It covers the typical datacenter activities that are spread over a long period of time, across multiple people and multiple targets (even target types).
    Here are some examples of change activity processes in a datacenter:

    - Patching large environments (PSU/CPU patching cycles)
    - Upgrading large numbers of database environments
    - Rolling out compliance rules
    - Database consolidation to Exadata environments

    CAP provides user flows for compliance officers/managers (including lead administrators) and operators (DBAs and admins). Managers can create change activity plans for various projects and allocate resources, targets, and the groups affected. Upon activation of the plan, tasks are created and automatically assigned to individual administrators based on target ownership. Administrators (DBAs) can identify their tasks and understand the context, schedules, and priorities. They can complete tasks using Enterprise Manager Cloud Control automation features such as patch plans (or, in some cases, outside Enterprise Manager). Upon completion, compliance is evaluated for validation, and the status of the tasks and plans is updated. Learn More about CAP ...

    Improved Configuration & Compliance Management of Large Numbers of Systems

    Improved configuration comparison: get to configuration comparison results faster for simple ad hoc comparisons. When performing a one-to-one comparison, Enterprise Manager now performs the comparison immediately and takes the user directly to the results, without a job having to be submitted and executed. Flattened system comparisons reduce comparison setup time and complexity: in addition to the previously existing topological comparison, users now have the option to compare using a "flattened" methodology, which removes duplicate target instances within the systems and removes the hierarchy of member targets. The result is that differences are much easier to spot, particularly in use cases like comparing patch levels between complex systems such as RAC and Fusion Apps.

    Improved Configuration Search & Advanced EMCLI Script Option for Mass Automation

    Enterprise Manager 12c introduces a new framework-level capability to script and stitch together multiple tasks using EMCLI. This powerful capability can be leveraged for lifecycle operations, especially when executing a task over a large number of targets. Specific uses include retrieving a qualified list of targets using configuration search and then feeding the result set into automation, or executing a patching operation and then re-executing it on targets where it failed. This is complemented by other enhancements, such as better usability for designing reusable configuration searches: in EM 12c Release 3, a simplified UI makes building ad hoc searches even easier. Searching for missing patches is a common use of configuration search; this used to require advanced options, which are now clearly defined and easy to use. Configuration searches can also be found and executed from the EMCLI, which is extremely useful for building sophisticated automation scripts. For example, run the search named "Oracle Databases on Exadata", which finds all database targets running on top of Exadata, and further filter the results by name, host, etc.:

        emcli get_targets -config_search="Databases on Exadata" -target_name="exa%"

    Use this in mass-automation operations with the new EMCLI script option, for example to find all databases that run on Exadata and house E-Biz, and patch them.
    To do so, create a Python script with emcli functions and invoke it directly in the new EMCLI script option shell:

        $ <path to emcli>/emcli @myPSU_Patch.py

    Richer compliance content: there are now over 50 Oracle-provided compliance standards, including new standards for Pluggable Database, Fusion Applications, Oracle Identity Manager, Oracle VM and Internet Directory, plus 9 Oracle-provided real-time monitoring standards containing over 900 compliance rules across 500 facets. The new real-time standards cover both Exadata compute nodes and Linux servers. The result is increased Oracle software coverage and faster time to compliance monitoring on Exadata.

    Enhancements to Patch Management

    Overhauled "offline" patching experience: the patch upload UI has been simplified, and there is now a single-step process to get patches into the software library. Customers often maintain local repositories of patches, sometimes called software depots, where they host the patches downloaded from My Oracle Support. In the past, you had to move these patches to your desktop and then upload them to Enterprise Manager's software library through the Cloud Control user interface. You can now use the following EMCLI command to upload multiple patches directly from a remote location within the data center:

        $ emcli upload_patches -location <Path to Patch directory> -from_host <HOSTNAME>

    The upload process filters all of the new patches, automatically selects the relevant metadata files from the location, and uploads the patches to the software library.

    Other improvements: patch rollback for single-instance databases, a new option in the patch plan that rolls back the patches added to the plan along with the SQL they applied; and improved, faster configuration collection of Oracle Home targets, which enables more reliable automation of higher-level functions like provisioning, patching and Database as a Service.

    To recap, here is the list of database lifecycle management features (items new or enhanced in Release 3 were highlighted in the original post):

    - Discovery, inventory tracking and reporting
    - Database provisioning, including migration to pluggable databases, plugging and unplugging of pluggable databases, gold-image-based cloning, and scaling of RAC nodes
    - Schema and data change management
    - End-to-end patch management in online and offline modes, including patch advisories in online (connected with My Oracle Support) and offline mode, patch pre-deployment analysis, deployment and rollback (currently only for single-instance databases), and reporting
    - Upgrade planning and execution of the upgrade process
    - Configuration management
    - Compliance management with out-of-box content
    - Change Activity Planner for planning, designing and tracking long-running processes

    For more information on Enterprise Manager's database lifecycle management capabilities, visit http://www.oracle.com/technetwork/oem/lifecycle-mgmt/index.html

    Read the article

  • Delivering Oracle DBaaS: Journey to Enterprise Cloud with Oracle Database 12c

    - by B R Clouse
    The release of Oracle Database 12c is accompanied by extensive supporting collateral that details the new features and options of this major release. But with so much to read and investigate, where to start? If you don't have the time to pore through everything, you may wish to organize your reading around the use case you're most interested in. If your interest is Database as a Service in private database clouds, then may I suggest that you start here: Accelerate the Journey to Enterprise Cloud with Oracle Database 12c. This paper describes the phases of the journey to enterprise cloud and enumerates the new features and options in Oracle Database 12c that support each phase. Oracle Multitenant figures prominently, but it's not the only cloud-enabling topic: Oracle Database Quality of Service Management, Application Continuity, Automatic Data Optimization, Global Data Services and Active Data Guard Far Sync all deliver key benefits for delivering database as a service. Further reading and research are suggested by the references included in the paper. Happy clouding!

    Read the article

  • How to Back Up a MySQL Database Using phpMyAdmin

    - by Jyoti
    It is very important to back up your MySQL database; you will probably realize this when it is too late. A lot of web applications use MySQL for storing their content: blogs and many other things. When all your content is in HTML files on your web server, it is easy to keep it safe from crashes: you keep a copy on your own PC and upload it again after the web server is restored. The content in the MySQL database must be backed up as well. If you have spent a lot of time creating content and it is stored only in the MySQL server, you will feel very bad if it is lost forever. Backing it up once a month or so ensures you never lose too much work in case of a server crash, and it will let you sleep better at night. It is easy and fast, so there is no reason not to do it.

    Step 1: Log into phpMyAdmin on your server.
    Step 2: Select the database that you would like to back up from the drop-down menu called Database.
    Step 3: A new page will load in phpMyAdmin showing the selected database. To proceed with the backup, click on the Export tab.
    Step 4: Apart from the defaults, select the option Save as file, which will save the backup locally to your computer in .sql format, and Add DROP TABLE, which adds DROP TABLE statements in case the tables already exist in the database you restore into.
    Step 5: Click on the Go button to start the export/backup procedure for your database. A download window will pop up prompting for the exact place where you would like to save the file on your local computer. It is possible that the download starts automatically; this depends on your browser's settings.
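
    Once this becomes a monthly routine, the same dump is easier to script than to click through; the mysqldump command-line tool produces an equivalent .sql file. A sketch that shells out to it from Python (the database name and credentials are placeholders):

        import subprocess
        from datetime import date

        DB, USER, PASSWORD = "whatever", "root", "secret"  # placeholders
        outfile = "%s-%s.sql" % (DB, date.today())

        with open(outfile, "w") as fh:
            # --single-transaction gives a consistent snapshot for InnoDB.
            subprocess.run(
                ["mysqldump", "--user=" + USER, "--password=" + PASSWORD,
                 "--single-transaction", DB],
                stdout=fh, check=True,
            )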

    Read the article
