Search Results

Search found 31120 results on 1245 pages for 'database connectivity'.


  • Complex knowledge management system with CRM, written internally

    - by JonH
    We've all heard of Salesforce and SugarCRM and the likes of systems like this. Unfortunately, at my workplace we have been asked to write a similar system (rather than license or purchase one). Basically the database is fairly large. Think of modules such as: corporate groups, customers, programs, projects, sub-projects, and issue management. In simple terms, a corporate group has one to many customers. A program has one or more projects. A project has one or more sub-projects. And an issue can be created on many sub-projects. Of course the system is a bit more complex, but instead of listing every single module I think it's best to keep it simple. In any event, the system in its current state has only two resources working on it (basically we have to do it all: CSS, database, jQuery, ASP.NET and C#). We've started off well by defining the UI master and footer pages so that we can reuse those across all of our pages. Now comes the hard part. The system will have about 4k end users, with say 5-10% being concurrent users. We are wondering if it makes sense to cache our database data (for say 5-10 minutes) rather than continuously hit our database. The reason is that some of these pages may have 5-10 search filters associated with them. Imagine how many database hits occur every time a selection is made in a search box. Also, some of these search fields cascade, so selecting, for instance, an initial drop-down may cascade to several drop-down boxes under it. Is it wrong to cache? I am not finding many articles on whether it is a good idea or not. Remember, the system is similar to a CRM system where we manage our various customers, projects, sub-projects, issues, etc.

    Read the article

  • What's the entry path towards a database administrator job?

    - by FarmBoy
    I've recently lost my job, and I'm working towards changing vocations. My degrees are in Mathematics, but I'm interested in IT, particularly working as a DBA or a programmer. I don't have IT experience, but I have the resources to be patient with the transition, and I'm currently learning SQL and Java. Obviously, I need some job experience. My question is this: What entry-level jobs might allow me to gain useful experience towards obtaining a DBA job? It seems to me that programmers often start as testers, and system administrators could start at a help-desk position, but it is unclear how one begins to work with a company's database.

    Read the article

  • OVM Templates: Oracle Solaris Container with Oracle Database 11gR2

    - by Roman Ivanov
    I am delighted to inform you that Oracle just made available new Oracle Solaris Virtual Machine (VM) Templates: Oracle Solaris Container with Oracle Database 11gR2. These VM Templates are available for SPARC and x86 platforms. Both Oracle VM Templates are based on encapsulating an Oracle Solaris 10 Container, which can then be attached to a SPARC or x86 system running Oracle Solaris 10 10/09 or later. Make sure you select the correct SPARC or x86 platform. The download includes an Oracle Solaris 10 10/09 Container with Oracle Database 11gR2 pre-installed in the Container.

    Read the article

  • Latest Security Inside Out Newsletter Now Available

    - by Troy Kitch
    The September/October edition of the Security Inside Out Newsletter is now available. Learn about Oracle OpenWorld database security sessions, hands-on labs, and demos you'll want to attend, as well as frequently asked questions about Label-Based Access Controls in Oracle Database 11g. Subscribe here for the bi-monthly newsletter.  ...and if you haven't already done so, join Oracle Database on these social networks: Twitter Facebook LinkedIn Google+

    Read the article

  • SQL SERVER – Quick Note of Database Mirroring

    Just a day ago, I was invited to a round table meeting at a prestigious organization. They were planning to implement a High Availability solution using Database Mirroring. During the meeting, I made a few notes of what was being discussed there. I just thought it would be interesting for all of you to know about it. Database Mirroring works [...]
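
    The notes themselves are behind the link below; purely as a rough orientation (and not the post's content), a minimal Database Mirroring setup in T-SQL looks something like the sketch that follows. The database name, endpoint name, port, and host names are placeholders.

        -- On each server: create a mirroring endpoint (name and port are examples)
        CREATE ENDPOINT Mirroring
            STATE = STARTED
            AS TCP (LISTENER_PORT = 5022)
            FOR DATABASE_MIRRORING (ROLE = PARTNER);

        -- Restore the database on the mirror server WITH NORECOVERY first, then:
        -- On the mirror server:
        ALTER DATABASE SalesDB SET PARTNER = 'TCP://principal.contoso.com:5022';
        -- On the principal server:
        ALTER DATABASE SalesDB SET PARTNER = 'TCP://mirror.contoso.com:5022';
        -- High-safety (synchronous) mode, required for automatic failover with a witness:
        ALTER DATABASE SalesDB SET PARTNER SAFETY FULL;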

    Read the article

  • Impact of Truncate or Drop Table When Flashback Database is Enabled

    - by alejandro.vargas
    Recently I was working on a VLDB, on the implementation of a disaster recovery environment configured with Data Guard physical standby and fast-start failover. One of the questions that came up was about the overhead of truncating and dropping tables. There are daily jobs on the database that truncate extremely large partitions, and as note 565535.1 explains, we knew there is an overhead for these operations. But the information in the note was not clear enough, so with the additional information I got from senior Oracle colleagues I compiled the document "Impact of Truncate or Drop Table When Flashback Database is Enabled", which further explains the case.

    Read the article

  • SQL Server Maintenance Plan error on an offline Database

    - by Sean Earp
    Today is SQL day for me :) I have a maintenance plan that is failing to run with the following error: Failed:(-1073548784) Executing the query "USE [SharedServices1_DB]" failed with the following error: "Database 'SharedServices1_DB' cannot be opened because it is offline.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. where SharedServices1_DB is a database that is set to offline. I would like to exclude this database from the maintenance plan, but when the database is offline, it does not show up at all as a "specific database" in the maintenance plan task, and if I bring it online, it is already unchecked in the maintenance plan task. How can I exclude an offline database from a maintenance plan?
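
    One possible workaround (a sketch, not a built-in maintenance plan setting) is to replace the failing task with an Execute T-SQL Statement task that checks the database state first and quietly skips offline databases. The database name below is the one from the question; sp_updatestats merely stands in for whatever maintenance work the plan performs.

        IF EXISTS (SELECT 1 FROM sys.databases
                   WHERE name = N'SharedServices1_DB'
                     AND state_desc = N'ONLINE')
        BEGIN
            -- run the maintenance work only when the database is online
            EXEC (N'USE [SharedServices1_DB]; EXEC sp_updatestats;');
        END
        ELSE
        BEGIN
            PRINT 'SharedServices1_DB is offline - skipping.';
        END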

    Read the article

  • Should a database server be in a different VM instance than the application?

    - by orokusaki
    I'm setting up a database server as a separate VM on my server so that I can control resources and make backups of just that instance. I own a server that will reside in a colo soon. Is this the best way to approach my DB regarding scalability? Are there any security concerns? Do I still listen on localhost, even though it's a separate instance? And is there any benefit to running your DB (PostgreSQL in my case) on the same machine as your application (a web-based SaaS application in my case)?
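
    If the database does move to its own VM, it can no longer listen only on localhost; a common pattern is to bind PostgreSQL to the private interface shared with the application VM and allow connections from that VM alone. A minimal sketch follows; the addresses are placeholders, ALTER SYSTEM requires PostgreSQL 9.4 or later (on older versions edit postgresql.conf directly), and changing listen_addresses needs a server restart.

        -- Listen on localhost plus the private interface the app VM can reach
        ALTER SYSTEM SET listen_addresses = 'localhost,10.0.0.5';
        -- pg_hba.conf should then allow only the application VM, for example:
        --   host    appdb    appuser    10.0.0.10/32    md5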

    Read the article

  • Improved Database Threat Management with Oracle Audit Vault and ArcSight ESM

    - by roxana.bradescu
    Data represents one of the most valuable assets in any organization, making databases the primary target of today's attacks. It is important that organizations adopt a database security defense-in-depth approach that includes data encryption and masking, access control for privileged users and applications, and activity monitoring and auditing. With Oracle Audit Vault, organizations can reliably monitor database activity enterprise-wide and alert on any security policy exceptions. The new integration between Oracle Audit Vault and ArcSight Enterprise Security Manager allows organizations to take advantage of enterprise-wide, real-time event aggregation, correlation and response to attacks against their databases. Join us for this live SANS Tool Talk event to learn more about this new joint solution and real-world attack scenarios that can now be quickly detected and thwarted.

    Read the article

  • SharePoint Content Database Sizing

    - by Sahil Malik
    SharePoint, WCF and Azure Trainings: more information. SharePoint stores the majority of its content in SQL Server databases. Many of these databases are concerned with the overall configuration of the system or with managed services support. However, a majority of these databases are those that accept uploaded or collaborative content. These databases need to be sized with various factors in mind, such as: the ability to back up and restore the content quickly, thereby allowing for quicker SLAs and isolation in the event of database failure. SharePoint as a system avoids SQL transactions in many instances. It does so to avoid locks, but does so at the cost of resultant orphan data or possible data corruption. Larger databases are known to have more orphan items than smaller ones. Also, smaller databases keep the problems isolated. As a result, it is very important for any project to estimate content database sizing up front. This is especially important in collaborative, document-centric projects. Not doing this upfront planning can Read full article ....

    Read the article

  • SQL SERVER – Script to Update a Specific Column in Entire Database

    - by Pinal Dave
    Last week, I received a very interesting question by email, and I really liked it as I had to play around with a SQL script for a while to come up with the answer the sender was looking for. Please read the question; I believe that all of us face this kind of situation. “Pinal, in our database we have recently introduced a ModifiedDate column in all of the tables. From now on, whenever an update happens to a row, we write the current date and time to that field. Now here is the issue: when we added that field we did not populate it with a default value, because we were not sure when we would go live with the system, so we let it be NULL. The modification to the application went live yesterday and we are now updating this field. Here is where I need your help. We need to update all the tables in our database that have the ModifiedDate column. As our system has been live since yesterday, there are several thousand rows which have already been updated with real-world values, so we do not want to touch those. Essentially, wherever in our entire database there is a ModifiedDate column and it is NULL, we want to update it with the current date and time. Do you have a script for it?” Honestly, I did not have such a script. This is a very specific requirement, but I was able to come up with two different methods to generate what he was looking for.

    Method 1: Using INFORMATION_SCHEMA

        SELECT 'UPDATE ' + T.TABLE_SCHEMA + '.' + T.TABLE_NAME +
               ' SET ModifiedDate = GETDATE() WHERE ModifiedDate IS NULL;'
        FROM INFORMATION_SCHEMA.TABLES T
        INNER JOIN INFORMATION_SCHEMA.COLUMNS C
            ON T.TABLE_NAME = C.TABLE_NAME
            AND C.COLUMN_NAME = 'ModifiedDate'
        WHERE T.TABLE_TYPE = 'BASE TABLE'
        ORDER BY T.TABLE_SCHEMA, T.TABLE_NAME;

    Method 2: Using DMV

        SELECT 'UPDATE ' + SCHEMA_NAME(t.schema_id) + '.' + t.name +
               ' SET ModifiedDate = GETDATE() WHERE ModifiedDate IS NULL;'
        FROM sys.tables AS t
        INNER JOIN sys.columns c
            ON t.OBJECT_ID = c.OBJECT_ID
        WHERE c.name = 'ModifiedDate'
        ORDER BY SCHEMA_NAME(t.schema_id), t.name;

    The above scripts generate the UPDATE statements that do the task that was asked. We can adapt the same pattern to generate any other statement and retrieve any other data as well. Click to Download Scripts Reference: Pinal Dave (http://blog.sqlauthority.com)  Filed under: PostADay, SQL, SQL Authority, SQL Joins, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Does tempdb Get Recreated From model at Startup?

    - by Jonathan Kehayias
    In my last post, Does the tempdb Log file get Zero Initialized at Startup?, I questioned whether or not tempdb is actually created from the model database at startup.  There is actually an easy way to prove that this statement, at least internally to the tempdb database, is in fact TRUE.  Many thanks go out to Bob Ward (Blog | Twitter) for pointing this out after trading emails with him. To validate that tempdb is actually copied at startup from the model database, all that is necessary...(read more)
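
    The full post contains the author's actual proof; purely as an illustration of the idea (and not necessarily the method used in the article), one simple way to watch model's contents flow into tempdb is to create a marker object in model, restart the instance, and look for it in tempdb.

        USE model;
        CREATE TABLE dbo.ModelCopyMarker (id INT);
        GO
        -- restart the SQL Server instance, then:
        SELECT name FROM tempdb.sys.tables WHERE name = N'ModelCopyMarker';
        GO
        -- clean up so the marker does not appear in every future database
        USE model;
        DROP TABLE dbo.ModelCopyMarker;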

    Read the article

  • How to create an Access database by using ADOX and Visual C# .NET

    - by SAMIR BHOGAYTA
    Build an Access Database
    1. Open a new Visual C# .NET console application.
    2. In Solution Explorer, right-click the References node and select Add Reference.
    3. On the COM tab, select Microsoft ADO Ext. 2.7 for DDL and Security, click Select to add it to the Selected Components, and then click OK.
    4. Delete all of the code from the code window for Class1.cs.
    5. Paste the following code into the code window:

        using System;
        using ADOX;   // COM reference: Microsoft ADO Ext. 2.7 for DDL and Security

        class Class1
        {
            static void Main()
            {
                // Create a new Jet (Access) database file
                ADOX.CatalogClass cat = new ADOX.CatalogClass();
                cat.Create("Provider=Microsoft.Jet.OLEDB.4.0;" +
                           "Data Source=D:\\NewMDB.mdb;" +
                           "Jet OLEDB:Engine Type=5");
                Console.WriteLine("Database Created Successfully");
                cat = null;
            }
        }

    Read the article

  • Is there a "rigorous" method for choosing a database?

    - by Andrew Martin
    I'm not experienced with NoSQL, but one person on my team is calling for its use. I believe our data and its usage aren't optimal for a NoSQL implementation. However, my understanding is based on reading various threads on various websites. I'd like to get some stronger evidence as to who's correct. My question is therefore: "Is there a technique for estimating the performance and requirements of a certain database, that I could use to confirm or modify my intuitions?" Is there, for example, a good book for calculating the performance of equivalent MongoDB/MySQL schemas? Is the only really reliable option to build the whole thing and take metrics?

    Read the article

  • Database checksum features - redundant? useful?

    - by Eloff
    Just about every mainstream DB has a feature to calculate checksums per page, per sector, or per record. Now, for a DB that does a full recovery after any crash, like PostgreSQL, is a checksum even useful? There will be no data loss as long as the xlog is OK, no matter what kind of corruption happened to the data itself, because as the redo log is replayed every committed transaction will be restored. So checksums are useless on restore. Doesn't the filesystem or disk keep checksums anyway to detect corruption? So unless the checksum is per record, all it does is tell you there is corruption - which the OS should be yelling at you about the minute you try to read it - so it's useless in operation? I can't imagine how a checksum can be helpful in any sane database - but since they all use them, I'd say that's just a failure of imagination on my part. So how is it useful?
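
    For reference, page-level checksums are an explicit, opt-in feature in several engines rather than something the filesystem is assumed to provide; two hedged examples follow (the database name is a placeholder, and the PostgreSQL setting only exists on versions where cluster checksums were chosen at initdb time with --data-checksums).

        -- SQL Server: write and verify a checksum on every page of a database
        ALTER DATABASE [MyDb] SET PAGE_VERIFY CHECKSUM;

        -- PostgreSQL: inspect whether the cluster was created with data checksums
        SHOW data_checksums;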

    Read the article

  • Oracle Database In-Memory: Launch in Frankfurt

    - by Carsten Czarski
    This time there is something old ... and something new. First the new: on June 11, Larry Ellison will present the new, groundbreaking Oracle Database In-Memory functionality in Redwood Shores. With this new technology, customers benefit from accelerated database performance for analytics, data warehousing, reporting and online transaction processing (OLTP). Only 6 days later - on June 17 - the only European launch event takes place in Frankfurt. In addition to technical sessions, a panel discussion and demos, a talk by Andy Mendelsohn, Head of Database Product Development, is planned. Register today. And here is the old: who still remembers HTML DB...? In the archives of the APEX Community site we found a video showing how pages in HTML DB could be locked against other developers. That still exists today, by the way - it just looks a little different. Enjoy watching.

    Read the article

  • IDC report - Highlights from Oracle OpenWorld 2012: Oracle Database 12c and Oracle Exadata X3

    - by Javier Puerta
    In December 2012, IDC published "Highlights from Oracle OpenWorld 2012: Oracle Database 12c and Oracle Exadata X3". IDC provides a concise description of the technical and business benefits of Exadata X3 and Oracle Database 12c (focusing on Pluggable Databases). IDC states:  “The announced technologies [X3 & 12c] enhance Oracle’s position as an innovator that continues to enhance the value delivered to customers”.   You can download the full report here.  (Oracle has purchased electronic distribution rights to this research note. Electronic rights expire in June 2013.)

    Read the article

  • Partitioning tutorial - new features in Oracle Database 12c

    - by KLaker
    For data warehousing projects Oracle Partitioning really is a must-have feature because it delivers so many important benefits: it dramatically improves query performance and speeds up database maintenance operations; it lowers costs by enabling a tiered storage approach that allows data to be stored on the most cost-effective storage for better resource utilisation; and, combined with Oracle Advanced Compression, it provides an automated approach to information lifecycle management using a simple, efficient, yet powerful way to manage data growth and reduce complexity and costs. To help you get the most from partitioning we have released a new tutorial that covers the 12c new features. Topics include how to: use Interval Reference Partitioning; perform cascading TRUNCATE and EXCHANGE operations; move partitions online; maintain multiple partitions; maintain global indexes asynchronously; and use partial indexes. For more information about this tutorial follow this link to the Oracle Learning Library: http://apex.oracle.com/pls/apex/f?p=44785:24:0::NO:24:P24_CONTENT_ID,P24_PREV_PAGE:8408,2 where you can begin your tutorial right now! For more information about Oracle Partitioning visit our home page on OTN: http://www.oracle.com/technetwork/database/bi-datawarehousing/dbbi-tech-info-part-100980.html
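
    As a taste of two of those topics, here is a rough sketch of what the 12c syntax looks like; the table, partition, and tablespace names below are made up for illustration.

        -- Move a partition online, keeping its indexes usable (Oracle 12c)
        ALTER TABLE sales MOVE PARTITION sales_q1_2014
            TABLESPACE archive_ts ONLINE UPDATE INDEXES;

        -- Cascading TRUNCATE of a partition and its reference-partitioned child tables (12c)
        ALTER TABLE orders TRUNCATE PARTITION orders_q1_2014 CASCADE;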

    Read the article

  • Consuming JSON stream into AWS Database on the cheap

    - by wjl
    I'm working on a project that needs to consume a JSON stream (approximately 1MB / minute), and parse and insert objects into a database. Amazon's DynamoDB or SimpleDB seem like attractive options for this. Is there a web service that can run a very simple script to eat the data and put it in a database? I could use a worker on Heroku or Elastic Beanstalk, or even pure EC2, but I'd like to find a service that's much cheaper, due to the very low amount of bandwidth and CPU required. (Sorry for the crappy tags. I'm not even sure where to categorize this question.)

    Read the article

  • Connecting to an Amazon AWS database [closed]

    - by Adel
    So I'm a bit overwhelmed/bewildered by the whole concept of networking/remote desktop, etc. The context is that in my company I need to access a remote database. The standard way I use is to first connect using a VPN client (called Shrew Soft Access Manager); then once that says "network device configured tunnel enabled" I'm good to connect using Windows "Remote Desktop Connection". But now our company set up an Amazon AWS database, and I'm told I need to connect, and I only need to use RDP. So I tried the standard Windows one - but it doesn't work. On Wikipedia, I looked up remote desktop software and downloaded one called VNC Viewer, but it doesn't work. Any advice/tips/comments appreciated. EDIT: YAY! I finally got a little more connected. I had to use my username as a fully qualified name: Computer: XYZ.XYZ.XYZ.XYZ USERNAME: XYZ.XYZ.XYZ.XYZ\aazzam

    Read the article
