Search Results

Search found 109429 results on 4378 pages for 'content management server'.


  • WebLogic Server 12c operational tips and upgrade notes - from the WebLogic Server 12c Forum 2012 session

    - by ???02
    Following the January 2012 release of WebLogic Server 12c, an event called WebLogic Server 12c Forum 2012 was held in August 2012. This article summarizes one of its sessions, which covered day-to-day operational tips that apply to WebLogic Server 11g as well as 12c, followed by points to watch when moving to 12c.
    Boot identity file (boot.properties). So that a server can start without an interactive prompt for credentials (for example when it is started from an init.d script), place a boot.properties file under <DOMAIN_HOME>/servers/<server_name>/security containing the following two lines; the values are encrypted the first time the server starts with them.
    username=
    password=
    Recording WLST scripts from the administration console. Repetitive configuration work can be scripted with WLST (WebLogic Scripting Tool), which is built on Jython (a Java implementation of Python) and manipulates the server's MBeans. Rather than writing scripts from scratch, you can record operations performed in the administration console: enable recording in the console preferences, carry out the operations, and the console writes them to a .py file that can be edited and replayed with WLST.
    Logging the SQL issued through WebLogic Server data sources. When you need to see exactly which SQL statements an application sends through a connection pool, enable the server debug flag [weblogic]-[jdbc]-[sql]-[DebugJDBCSQL] on the server's Debug page. The JDBC subsystem then writes the SQL statements and ResultSet handling to the server log, which helps when isolating application or database problems.
    Identifying which WebLogic Server instance owns which database session. When several WebLogic Server instances connect to the same Oracle Database, the V$SESSION.PROGRAM column normally shows only the generic "JDBC Thin Client", which makes it hard to tell the sessions apart. Add a line of the following form to the data source's connection properties so that each server reports its own name in V$SESSION.PROGRAM:
    v$session.program=
    Sending application logs to the WebLogic Server log. Application logging based on java.util.logging or Log4j can be routed into the server log so that server and application messages end up in one place: use weblogic.logging.ServerLoggingHandler for JDK logging, or weblogic.logging.log4j.ServerLoggingAppender for Log4j. A Log4j configuration fragment looks like this:
    <appender name="file" class="org.apache.log4j.FileAppender">
      <param name="File" value="applog.log" />
      <param name="Append" value="true" />
      <layout class="org.apache.log4j.PatternLayout">
        <param name="ConversionPattern" value="%d %5p %c{1} - %m%n" />
      </layout>
    </appender>
    <appender name="server" class="weblogic.logging.log4j.ServerLoggingAppender"/>
    Point the server at the logging configuration with the usual JVM options:
    JDK logging: -Djava.util.logging.config.file=<PATH>
    Log4j: -Dlog4j.configuration=<PATH>
    Looking up the meaning of BEA message IDs. Messages written to the WebLogic Server logs carry IDs of the form BEA-XXXXXX. The GetMessage utility prints the catalog entry for an ID, including its severity, cause and recommended action:
    java weblogic.GetMessage -id <message id> -verbose -lang <locale>
    For example, "java weblogic.GetMessage -verbose -id 000337 -lang ja" returns the entry for BEA-000337, the warning raised when a thread has been busy longer than the configured StuckThreadMaxTime; its recommended action includes taking a thread dump (for example with weblogic.Admin THREAD_DUMP).
    The second half of the session covered changes to be aware of when moving from WebLogic Server 11g (10.3.6) to WebLogic Server 12c.
    Supported platforms and browsers. Check the OTN document "System Requirements and Supported Platforms for WebLogic Server 12c (12.1.1.x)" for the current platform list, including the status of Mac OS X (Snow Leopard with Java SE 7, per the August 2, 2012 revision). The administration console no longer supports Internet Explorer 6.x and 7.x; use IE 8.x or later. 64-bit web server plug-in modules are also provided.
    RESTful management services. WebLogic Server 12c adds JAX-RS (Java EE 6) based monitoring resources. Enable [RESTful management services] in the domain's general configuration, then query URLs of the form
    http(s)://<host>:<port>/management/tenant-monitoring/servers
    Replacing "servers" with "clusters", "applications" or "datasources" returns the corresponding resources, and the output can be requested as HTML, JSON or XML.
    JDBC and data sources. 12c adds Active GridLink for RAC data sources. Per-data-source diagnostics are written to
    <DOMAIN_HOME>/servers/<server_name>/logs/datasource.log
    and can be combined with WLDF for deeper analysis. The session also listed the Oracle error codes most often seen when database connectivity is lost:
    3113  end-of-file on communication channel
    3114  not connected to Oracle
    1033  Oracle initialization or shutdown in progress
    1034  Oracle not available
    1089  immediate shutdown in progress
    1090  shutdown in progress
    17002 I/O exception
    Boot identity and security. The way administrator credentials are supplied at startup with the system properties
    -Dweblogic.management.username= -Dweblogic.management.password=
    has changed, so startup scripts that rely on them should be reviewed. The Certicom-based SSL implementation (deprecated since 10.3.3) has been replaced by JSSE (Java Secure Socket Extension), which can affect existing SSL configurations.
    Node Manager. The default value of StartScriptEnabled in nodemanager.properties has changed from false to true, so managed servers started through Node Manager now go through the startWebLogic script by default; environments that relied on the old default should check their Node Manager configuration.
    Memory settings in setDomainEnv. The default permanent generation sizes for 32-bit JDKs have been raised to match the 64-bit values:
    WebLogic Server 11g (10.3.5): MEM_PERM_SIZE_32BIT="-XX:PermSize=48m", MEM_MAX_PERM_SIZE_32BIT="-XX:MaxPermSize=128m"
    WebLogic Server 12c (12.1.1): MEM_PERM_SIZE_32BIT="-XX:PermSize=128m", MEM_MAX_PERM_SIZE_32BIT="-XX:MaxPermSize=256m"
    (The 64-bit values, -XX:PermSize=128m and -XX:MaxPermSize=256m, are unchanged.)
    Shutdown scripts. The behavior of stopWebLogic, and of the shutdown.py WLST script it calls, with respect to stored administrator credentials differs between WebLogic Server 11g (10.3.3) and 12c, so automated shutdown procedures should be retested after the upgrade.
    The Q&A session touched on a few more differences: the KernelMBean attribute UseConcurrentQueueForRequestManager and the corresponding console option; the renaming of the Windows service wrapper from beasvc.exe to wlsvc.exe (service names now start with "wlsvc"); and the Jython version used by WLST (2.1 in earlier releases, 2.2.1 now). The session closed with pointers to WebLogic community resources, including a WebLogic community group on Facebook.

    Read the article

  • SCOM, Server 2008 and SQL Server 2008

    - by Jacques
    Hi there, I'm trying to set up SCOM (System Center Operations Manager 2007) for platform monitoring on my Server 2008 machine, using SQL Server 2008 running on the same machine. When I check the prerequisites I get problems on the SQL and Active Directory components. (I'm running SQL Server 2008 and Server 2008 with Active Directory not installed.) Errors: 1. Microsoft SQL Server 2005 Service Pack 1 required. Details: SQL Server 2005 SP1 is the next version of SQL Server. SQL Server 2005 Enterprise Edition is a complete data and analysis platform for large mission-critical business applications. The link provided in the resolution column is a trial version of the product and is not supported by the Microsoft SQL Server team. 2. In order to install, Active Directory needs to be present. Details: Setup failed to verify the presence of Active Directory for this server. I've got a couple of questions I need answered, and hope someone can help. Do I need to install Active Directory for SCOM to work? Can I run SCOM with a SQL 2008 database? How do I get past these problems?

    Read the article

  • Web server connection to SQL Server: Response Packet [Malformed Packet]

    - by John Murdoch
    I am seeing very, very sluggish performance between my web server (which handles HTTP web services connections) and a separate server running Microsoft SQL Server 2008. I have been capturing packet traffic on the web server trying to understand why things are running so slowly. I am using Wireshark to capture the packet traffic. The apparent problem is that the web server is sending TDS packets to the data server--each packet followed by a response from the data server with Response Packet [Malformed Packet] in the Info field. The packet sent from the web server appears to have an invalid checksum. Has anyone seen this type of problem before? Any ideas?

    Read the article

  • Can't restore backup from SQL Server 2008 R2 to SQL Server 2005 or 2008

    - by Erick
    Hi everyone, I'm trying to get a backup from SQL Server 2008 R2 restored to SQL Server 2008, but when we try to do the restore we get this: The database was backed up on a server running version 10.50.1092. That version is incompatible with this server, which is running version 10.00.2531. Either restore the database on a server that supports the backup, or use a backup that is compatible with this server. I can use the script wizard to generate a script, but that takes over an hour to run. I also tried just exporting the data from server to server, but it had issues with the primary keys/identity columns. I will be running into this issue with several other clients so any help you could offer about how to get around this would be great. Thanks for your help!
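    A quick way to confirm the mismatch before attempting the restore is to read the backup header and check which build produced it; this is only a sketch, and the file path below is a placeholder:
    -- Inspect the backup header without restoring anything.
    RESTORE HEADERONLY FROM DISK = N'C:\Backups\MyDatabase.bak';
    -- SoftwareVersionMajor/Minor/Build of 10.50.x means SQL Server 2008 R2.
    -- A backup can only be restored to the same or a newer engine version, so the
    -- practical options remain scripting the schema/data or an export/import.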

    Read the article

  • SQL Server 2008: Can't connect to remote server via management studio but can telnet in fine

    - by WarpKid
    Hi, I am in the process of trying to configure SQL Server 2008 to accept remote connections. I have been through all the documentation I can find, and yet when I attempt to connect through Management Studio I get an error stating that the server could not be found. Interestingly, I can connect through telnet to the remote server via the port that SQL Server is listening on, and in the SQL Server logs I can see the connection attempt. So SQL Server is up and running and listening on the correct port - no firewall blocking it. It appears that SQL Server is listening on port 50314 by default, while Management Studio attempts to connect on port 1433. Weird. Management Studio = no dice. Anyone got any ideas? The server is set to allow remote connections, TCP/IP is enabled, and the firewall is off. Thanks. UPDATE TO CLEAR THINGS UP A BIT: We see the connection attempt in the SQL Server logs when we telnet in on port 50314. When we log in through Management Studio it attempts the connection on port 1433, and there is no sign of that connection attempt in the logs.
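    A minimal check, run on the remote server, to confirm which TCP port the instance is actually listening on (the search parameters of xp_readerrorlog are undocumented but widely available on SQL Server 2005 and later; the port below is the one from the question):
    -- Find the "Server is listening on" lines in the current SQL Server error log.
    EXEC sys.xp_readerrorlog 0, 1, N'Server is listening on';
    -- Once the port is known, Management Studio can connect with "ServerName,50314"
    -- (server name, a comma, then the port) instead of assuming the default 1433.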

    Read the article

  • SQL Server 2008 R2 upgrade fails on upgrade rule check

    - by Tim
    I'm trying to upgrade an evaluation instance of SQL Server 2008 to a fully licensed instance of SQL Server 2008 R2. I made it most of the way through the installer, but I'm getting stopped at the Upgrade Rules page - the SQL Server Analysis Services Upgrade Service Functional Check is failing. The specific error I get: Rule "SQL Server Analysis Services Upgrade Service Functional Check" failed. The current instance of the SQL Server Analysis Services service cannot be upgraded because the Analysis Services service is disabled or not online. Please start the service and then run the upgrade rules check again. Simple enough - just need to start the service. Here's where it gets troublesome. When I open Services and go to start the SQL Server Analysis Services (MSSQLSERVER) service, it provides me the following message: The SQL Server Analysis Services (MSSQLSERVER) service on Local Computer started and then stopped. Some services stop automatically if they are not in use by other services or programs. Trying from the command line as Administrator yields: PS C:\Windows\System32 net start MSSQLServerOLAPService The SQL Server Analysis Services (MSSQLSERVER) service is starting... The SQL Server Analysis Services (MSSQLSERVER) service could not be started. The service did not report an error. More help is available by typing NET HELPMSG 3534. I've tried changing the logon setting of this service to Administrator, a user with admin privileges, and both the Local System and Network Service accounts - nothing works. In addition, when I look at the service through the SQL Server Configuration Manager (also run as Administrator), attempting to change the logon setting for the service results in the message: The server threw an exception. [0x80010105] I have no need for analysis services themselves - all I need is for this one service to be running long enough to do the R2 upgrade, then it can shut down again. Any thoughts on how to get the Analysis Services service running? Update: Checking the event log, I found an error logged to the Application log from the MSSQLServerOLAPService. It has event ID 0, task category (289), and says: The service cannot be started: XML parsing failed at line 1, column 4: Unrecognized input signature.

    Read the article

  • Authenticating Linked Servers - SQL Server 8 to SQL Server 10

    - by jp2code
    We have an old SQL Server 2000 database that has to be kept because it is needed on our manufacturing machines. It also maintains our employee records, since they are needed on these machines for employee logins. We also have a newer SQL Server 10 database (I think this is 2008, but I'm not sure) that we are using for newer development. I have recently learned (i.e. today) that I can link the two servers, which would allow me to access the employee tables from the newer server. Following the SF post "SQL Server to SQL Server Linked Server Setup", I tried adding the link. On our SQL Server 2000 machine, I got this error: Similarly, on our SQL Server 10 machine, I got this error: The messages, though worded differently, probably say the same thing: I need to authenticate, somehow. We have an Active Directory, but it is on yet another server. What, exactly, should be done here? Another answer said to check the Security settings, but did not say what else to do. Both servers are set to SQL Server and Windows Authentication mode. Now what?
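    A minimal sketch of creating the link from the SQL Server 2000 box to the newer server with an explicit SQL login mapping, so that neither Windows nor Active Directory authentication is needed between the two machines; the server, database, login and password names below are placeholders:
    -- Register the remote server under its network name.
    EXEC sp_addlinkedserver
        @server = N'NewServerName',      -- host or host\instance of the newer server
        @srvproduct = N'SQL Server';
    -- Map all local logins to one SQL login that exists on the remote server.
    EXEC sp_addlinkedsrvlogin
        @rmtsrvname = N'NewServerName',
        @useself = N'FALSE',             -- do not forward the current Windows login
        @locallogin = NULL,              -- applies to every local login
        @rmtuser = N'linked_reader',
        @rmtpassword = N'********';
    -- Quick test of the link:
    -- SELECT TOP 1 * FROM [NewServerName].EmployeeDb.dbo.Employees;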

    Read the article

  • Migrate AD DS Server 2003 to Server 2008 R2

    - by user2566483
    I would like to get a couple of opinions. I found this article online and wanted to know if it is good to follow: http://www.msserverpro.com/migrating-active-directory-domain-controller-from-windows-server-2003-sp2-to-windows-server-2008-r2/ A couple of things need to be done: 1. Move all Active Directory settings from the old Server 2003 server to the new Server 2008 R2 server. 2. Set up all users on the new server using csvde: csvde -f output.csv (on the old server), then csvde -i -f output.csv (on the new server). Any advice would be greatly appreciated.

    Read the article

  • Partitioned Repository for WebCenter Content using Oracle Database 11g

    - by Adao Junior
    One of the biggest challenges for content management solutions is storage management, driven by the relentless growth of information. Even if you have storage appliances and a lot of terabytes, things like backup, compression, deduplication, storage relocation, encryption and availability can be a nightmare. One standard option you have with Oracle WebCenter Content is to store the data in the database, and Oracle Database lets you leverage features like compression, deduplication, encryption and seamless backup. But with a huge volume, the challenge is passed to the DBA to keep the WebCenter Content database up and running. One solution is to use database partitions for your content repository, but what are the implications of this? Can I fit this with my business requirements? Well, yes. It is up to you how you manage it; you just need a good plan. During your "storage brainstorm", consider what you need: do you have to store petabytes of documents? Does everything need to be online? Is there a way to logically separate the "good content" from the "legacy content"? The first thing that comes to mind is to use the creation date of the document, but remember that a document can receive many revisions, so you may want to consider the revision creation date instead. Your plan can also have more complex rules, such as per document type, per a custom metadata field like department, or a hybrid of date, DocType and a specific virtual folder. Taking this further, you can have your repository distributed across different servers, different disks and different disk types (such as SSD, SAS, SATA, tape, ...), separated according to your business requirements, keeping the "hot" content apart from the legacy content and easily matching your compliance requirements. If you partition by revision, the simple way is to use dId, the sequential unique id of every content item created in WebCenter Content, or dLastModified, the date column of the FileStorage table that records when the content was stored in the database using SecureFiles. Using the scenario of a partitioned repository with a hierarchical separation by date, we will transform the FileStorage table into a partitioned table using "Partition by Range" on the dLastModified column (you can use dId, or a join with other tables for other metadata such as dDocType, security, etc.). The test scenario below covers: existing data in the JDBC Storage to be migrated to the new partitioned JDBC Storage; partitioning by date; automatic generation of new partitions based on a pre-defined interval (available only with Oracle Database 11g and later); deduplication and compression for legacy data; Oracle WebCenter Content 11g PS5 (which may include some customizations that do not affect the test scenario). For the test case you need some data already stored using JDBC Storage to act as the "legacy" data. If you have not done this before, just create a storage rule pointed to the JDBC Storage: enable the StorageRule metadata field in the UI and upload some documents using this rule. You can run this test case as the schema owner or as a DBA user; we will use the schema owner TESTS_OCS. I cannot forget to say that this is just a test and you should take a proper backup of your environment. When you use the schema owner, you need some privileges; using the DBA user, grant the privileges needed: REM Grant privileges required for online redefinition.
    GRANT EXECUTE ON DBMS_REDEFINITION TO TESTS_OCS;
    GRANT ALTER ANY TABLE TO TESTS_OCS;
    GRANT DROP ANY TABLE TO TESTS_OCS;
    GRANT LOCK ANY TABLE TO TESTS_OCS;
    GRANT CREATE ANY TABLE TO TESTS_OCS;
    GRANT SELECT ANY TABLE TO TESTS_OCS;
    REM Privileges required to perform cloning of dependent objects.
    GRANT CREATE ANY TRIGGER TO TESTS_OCS;
    GRANT CREATE ANY INDEX TO TESTS_OCS;
    In our test scenario we will separate the content into Legacy, Day1, Day2, Day3 and Future. The last one will be partitioned automatically across three tablespaces in round-robin mode. In a real scenario the partition rule could be per month, per year, or any rule that you choose. Tablespaces for the test scenario:
    CREATE TABLESPACE TESTS_OCS_PART_LEGACY DATAFILE 'tests_ocs_part_legacy.dat' SIZE 500K AUTOEXTEND ON NEXT 500K MAXSIZE UNLIMITED;
    CREATE TABLESPACE TESTS_OCS_PART_DAY1 DATAFILE 'tests_ocs_part_day1.dat' SIZE 500K AUTOEXTEND ON NEXT 500K MAXSIZE UNLIMITED;
    CREATE TABLESPACE TESTS_OCS_PART_DAY2 DATAFILE 'tests_ocs_part_day2.dat' SIZE 500K AUTOEXTEND ON NEXT 500K MAXSIZE UNLIMITED;
    CREATE TABLESPACE TESTS_OCS_PART_DAY3 DATAFILE 'tests_ocs_part_day3.dat' SIZE 500K AUTOEXTEND ON NEXT 500K MAXSIZE UNLIMITED;
    CREATE TABLESPACE TESTS_OCS_PART_ROUND_ROBIN_A DATAFILE 'tests_ocs_part_round_robin_a.dat' SIZE 500K AUTOEXTEND ON NEXT 500K MAXSIZE UNLIMITED;
    CREATE TABLESPACE TESTS_OCS_PART_ROUND_ROBIN_B DATAFILE 'tests_ocs_part_round_robin_b.dat' SIZE 500K AUTOEXTEND ON NEXT 500K MAXSIZE UNLIMITED;
    CREATE TABLESPACE TESTS_OCS_PART_ROUND_ROBIN_C DATAFILE 'tests_ocs_part_round_robin_c.dat' SIZE 500K AUTOEXTEND ON NEXT 500K MAXSIZE UNLIMITED;
    Before starting, gather optimizer statistics on the current FileStorage table:
    EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'FileStorage', cascade => TRUE);
    Now check whether the redefinition process can be executed:
    EXEC DBMS_REDEFINITION.CAN_REDEF_TABLE('TESTS_OCS', 'FileStorage', DBMS_REDEFINITION.CONS_USE_PK);
    If there are no error messages, you are good to go. Create a partitioned interim FileStorage table.
    You need to create a new table with the partition information to act as an interim table:
    CREATE TABLE FILESTORAGE_Part (
      DID NUMBER(*,0) NOT NULL ENABLE,
      DRENDITIONID VARCHAR2(30 CHAR) NOT NULL ENABLE,
      DLASTMODIFIED TIMESTAMP (6),
      DFILESIZE NUMBER(*,0),
      DISDELETED VARCHAR2(1 CHAR),
      BFILEDATA BLOB
    )
    LOB (BFILEDATA) STORE AS SECUREFILE ( ENABLE STORAGE IN ROW NOCACHE LOGGING KEEP_DUPLICATES NOCOMPRESS )
    PARTITION BY RANGE (DLASTMODIFIED)
    INTERVAL (NUMTODSINTERVAL(1,'DAY'))
    STORE IN (TESTS_OCS_PART_ROUND_ROBIN_A, TESTS_OCS_PART_ROUND_ROBIN_B, TESTS_OCS_PART_ROUND_ROBIN_C)
    (
      PARTITION FILESTORAGE_PART_LEGACY VALUES LESS THAN (TO_DATE('05-APR-2012 12.00.00 AM', 'DD-MON-YYYY HH.MI.SS AM'))
        TABLESPACE TESTS_OCS_PART_LEGACY
        LOB (BFILEDATA) STORE AS SECUREFILE ( TABLESPACE TESTS_OCS_PART_LEGACY RETENTION NONE DEDUPLICATE COMPRESS HIGH ),
      PARTITION FILESTORAGE_PART_DAY1 VALUES LESS THAN (TO_DATE('06-APR-2012 07.25.00 PM', 'DD-MON-YYYY HH.MI.SS AM'))
        TABLESPACE TESTS_OCS_PART_DAY1
        LOB (BFILEDATA) STORE AS SECUREFILE ( TABLESPACE TESTS_OCS_PART_DAY1 RETENTION AUTO KEEP_DUPLICATES COMPRESS ),
      PARTITION FILESTORAGE_PART_DAY2 VALUES LESS THAN (TO_DATE('06-APR-2012 07.55.00 PM', 'DD-MON-YYYY HH.MI.SS AM'))
        TABLESPACE TESTS_OCS_PART_DAY2
        LOB (BFILEDATA) STORE AS SECUREFILE ( TABLESPACE TESTS_OCS_PART_DAY2 RETENTION AUTO KEEP_DUPLICATES NOCOMPRESS ),
      PARTITION FILESTORAGE_PART_DAY3 VALUES LESS THAN (TO_DATE('06-APR-2012 07.58.00 PM', 'DD-MON-YYYY HH.MI.SS AM'))
        TABLESPACE TESTS_OCS_PART_DAY3
        LOB (BFILEDATA) STORE AS SECUREFILE ( TABLESPACE TESTS_OCS_PART_DAY3 RETENTION AUTO KEEP_DUPLICATES NOCOMPRESS )
    );
    After the creation you should see your partitions defined. Note that only the fixed range partitions have been created; none of the interval partitions have been created yet. Start the redefinition process:
    BEGIN
      DBMS_REDEFINITION.START_REDEF_TABLE(
        uname => 'TESTS_OCS'
        ,orig_table => 'FileStorage'
        ,int_table => 'FileStorage_PART'
        ,col_mapping => NULL
        ,options_flag => DBMS_REDEFINITION.CONS_USE_PK );
    END;
    This operation can take some time to complete, depending on how much content you have and on the size of the table. Using the DBA user you can check the progress with this command:
    SELECT * FROM v$sesstat WHERE sid = 1;
    Copy the dependent objects:
    DECLARE
      redefinition_errors PLS_INTEGER := 0;
    BEGIN
      DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS(
        uname => 'TESTS_OCS'
        ,orig_table => 'FileStorage'
        ,int_table => 'FileStorage_PART'
        ,copy_indexes => DBMS_REDEFINITION.CONS_ORIG_PARAMS
        ,copy_triggers => TRUE
        ,copy_constraints => TRUE
        ,copy_privileges => TRUE
        ,ignore_errors => TRUE
        ,num_errors => redefinition_errors
        ,copy_statistics => FALSE
        ,copy_mvlog => FALSE );
      IF (redefinition_errors > 0) THEN
        DBMS_OUTPUT.PUT_LINE('>>> FileStorage to FileStorage_PART temp copy Errors: ' || TO_CHAR(redefinition_errors));
      END IF;
    END;
    With the DBA user, verify that there are no errors:
    SELECT object_name, base_table_name, ddl_txt FROM DBA_REDEFINITION_ERRORS;
    Note that this will show two rows related to the constraints; this is expected.
    Synchronize the interim table FileStorage_PART:
    BEGIN
      DBMS_REDEFINITION.SYNC_INTERIM_TABLE(
        uname => 'TESTS_OCS',
        orig_table => 'FileStorage',
        int_table => 'FileStorage_PART');
    END;
    Gather statistics on the new table:
    EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'FileStorage_PART', cascade => TRUE);
    Complete the redefinition:
    BEGIN
      DBMS_REDEFINITION.FINISH_REDEF_TABLE(
        uname => 'TESTS_OCS',
        orig_table => 'FileStorage',
        int_table => 'FileStorage_PART');
    END;
    During the execution the FileStorage table is locked in exclusive mode until the operation finishes. After the last command the FileStorage table is partitioned. If you have content outside the fixed partition ranges, you will see the new interval partitions created automatically, so no error is raised if you "forgot" to create all the future ranges. You can now drop the FileStorage_PART table:
    DROP TABLE FileStorage_PART PURGE;
    To check that the FileStorage table is valid and is partitioned, use the command:
    SELECT num_rows, partitioned FROM user_tables WHERE table_name = 'FILESTORAGE';
    You can list the contents of the FileStorage table in a specific partition, for example:
    SELECT * FROM FileStorage PARTITION (FILESTORAGE_PART_LEGACY);
    Some useful commands that you can use to check the partitions (note that you need to run them as a DBA user):
    SELECT * FROM DBA_TAB_PARTITIONS WHERE table_name = 'FILESTORAGE';
    SELECT * FROM DBA_TABLESPACES WHERE tablespace_name LIKE 'TESTS_OCS%';
    After the redefinition process completes, you have a new FileStorage table that stores all content whose storage rule points to the JDBC Storage, partitioned according to the rules set when the temporary interim FileStorage_PART table was created. At this point you can test WebCenter Content by downloading documents (original and renditions). Note that the content could already be in the cache area: look in the weblayout directory to see if a file with the same id is there, then click on the web rendition of your test file and check that the file is created and can be opened; this means everything is working. The redefinition process can be repeated many times, which lets you test what the best layout is, over and over again. Now some interesting maintenance actions related to the partitions. Make a tablespace read only: there are no issues viewing content, and WebCenter Content does not alter the revisions, but when you try to delete a content item that lives in a read-only tablespace, an error occurs and the document is not deleted. The only way to prevent these errors today is to build a custom component that checks the partitions and, if a document sits in a read-only repository, runs the metadata deletion and marks the document to be physically removed at the next database maintenance window, such as a new redefinition. Take a tablespace offline, for archiving purposes or any other reason.
    When you try to open a document stored in that tablespace you will receive an error saying the content could not be retrieved, but the other, online tablespaces are not affected; the same happens when deleting documents. Again, a custom component is the solution: if a document is "out of range", the component can show a message that the repository for that document is offline, and this can be extended with an option for the user to request that it be brought online again. Moving some legacy content to an offline repository (table) can be done with the partition Exchange option, which moves the content from one partition into an empty non-partitioned table such as FileStorage_LEGACY. Note that this option removes the rows from FileStorage, so the stored content can no longer be opened, and you always need to keep the indexes and constraints in mind. Another option is a redefinition that separates the original content (vault) from the renditions and separates by date at the same time. This could be useful for DAM environments that want a special place for the renditions and put the original files on storage with lower performance. The process is the same; you just need to change the script for the interim table to use composite partitioning. It will be something like:
    CREATE TABLE FILESTORAGE_RenditionPart (
      DID NUMBER(*,0) NOT NULL ENABLE,
      DRENDITIONID VARCHAR2(30 CHAR) NOT NULL ENABLE,
      DLASTMODIFIED TIMESTAMP (6),
      DFILESIZE NUMBER(*,0),
      DISDELETED VARCHAR2(1 CHAR),
      BFILEDATA BLOB
    )
    LOB (BFILEDATA) STORE AS SECUREFILE ( ENABLE STORAGE IN ROW NOCACHE LOGGING KEEP_DUPLICATES NOCOMPRESS )
    PARTITION BY LIST (DRENDITIONID)
    SUBPARTITION BY RANGE (DLASTMODIFIED)
    (
      PARTITION Vault VALUES ('primaryFile')
      (
        SUBPARTITION FILESTORAGE_VAULT_LEGACY VALUES LESS THAN (TO_DATE('05-APR-2012 12.00.00 AM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_VAULT_DAY1 VALUES LESS THAN (TO_DATE('06-APR-2012 07.25.00 PM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_VAULT_DAY2 VALUES LESS THAN (TO_DATE('06-APR-2012 07.55.00 PM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_VAULT_DAY3 VALUES LESS THAN (TO_DATE('06-APR-2012 07.58.00 PM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_VAULT_FUTURE VALUES LESS THAN (MAXVALUE)
      ),
      PARTITION WebLayout VALUES ('webViewableFile')
      (
        SUBPARTITION FILESTORAGE_WEBLAYOUT_LEGACY VALUES LESS THAN (TO_DATE('05-APR-2012 12.00.00 AM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_WEBLAYOUT_DAY1 VALUES LESS THAN (TO_DATE('06-APR-2012 07.25.00 PM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_WEBLAYOUT_DAY2 VALUES LESS THAN (TO_DATE('06-APR-2012 07.55.00 PM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_WEBLAYOUT_DAY3 VALUES LESS THAN (TO_DATE('06-APR-2012 07.58.00 PM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_WEBLAYOUT_FUTURE VALUES LESS THAN (MAXVALUE)
      ),
      PARTITION Special VALUES ('Special')
      (
        SUBPARTITION FILESTORAGE_SPECIAL_LEGACY VALUES LESS THAN (TO_DATE('05-APR-2012 12.00.00 AM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_SPECIAL_DAY1 VALUES LESS THAN (TO_DATE('06-APR-2012 07.25.00 PM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_SPECIAL_DAY2 VALUES LESS THAN (TO_DATE('06-APR-2012 07.55.00 PM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_SPECIAL_DAY3 VALUES LESS THAN (TO_DATE('06-APR-2012 07.58.00 PM', 'DD-MON-YYYY HH.MI.SS AM')) LOB (BFILEDATA) STORE AS SECUREFILE,
        SUBPARTITION FILESTORAGE_SPECIAL_FUTURE VALUES LESS THAN (MAXVALUE)
      )
    ) ENABLE ROW MOVEMENT;
    The next post on partitioned repositories will come with a sample component to handle the possible exceptions when you need to take a tablespace or partition offline or move it to another place. We can also include some integration with Retention Management and Records Management. Another subject related to partitioning is the ability to create a FileStore Provider pointed to a different database, raising the level of distributed storage versus performance. Let us know if this is important to you, or if you have a use case not listed, leave a comment. Cross-posted on the blog.ContentrA.com
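    The partition-exchange step mentioned above, sketched under the assumption that FileStorage_LEGACY is an empty table created with exactly the same column and SECUREFILE LOB definitions as FileStorage (the names follow the test scenario):
    -- Swap the legacy partition's segment with the empty archive table.
    ALTER TABLE FileStorage
      EXCHANGE PARTITION FILESTORAGE_PART_LEGACY
      WITH TABLE FileStorage_LEGACY
      INCLUDING INDEXES
      WITHOUT VALIDATION;
    -- After the exchange the legacy rows live in FileStorage_LEGACY and the
    -- partition is empty, so WebCenter Content can no longer serve that content
    -- until it is exchanged back (or handled by a custom component, as noted above).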

    Read the article

  • SQL SERVER – PHP on Windows and SQL Server Training Kit

    - by pinaldave
    The PHP on Windows and SQL Server Training Kit includes a comprehensive set of technical content, including demos and hands-on labs, to help you understand how to build PHP applications using Windows, IIS 7.5 and SQL Server 2008 R2. This release includes the following:
    PHP & SQL Server Demos
    - Integrating SQL Server Geo-Spatial with PHP
    - SQL Server Reporting Services and PHP
    PHP & SQL Server Hands On Labs
    - Introduction to Using SQL Server with PHP
    - Using SQL Server Full-Text Search and FILESTREAM Storage with PHP
    - New: Getting Started with SQL Server Migration Assistant for MySQL
    Download SQL Server PHP on Windows and SQL Server Training Kit
    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Determine if SSRS 2012 is Installed on your SQL Server

    - by Pinal Dave
    This example is from Beginning SSRS by Kathi Kellenberger. Supporting files are available with a free download from the www.Joes2Pros.com web site. Determine if SSRS 2012 is installed on your SQL Server. You may already have SSRS, or you may need to install it. Before doing any installation it makes sense to know where you are now. If you happened to install SQL Server with all features, you have the tools you need. There are two things you need: SQL Server Data Tools and Reporting Services installed in Native Mode. To find out if SQL Server Data Tools (SSDT) is installed, click the Start button, go to All Programs, expand SQL Server 2012, and look for SQL Server Data Tools. Now, let's check to see if SQL Server Reporting Services is installed. Click Start > All Programs > SQL Server 2012 > Configuration Tools > SQL Server Configuration Manager. Once Configuration Manager is running, select SQL Server Services and look for SQL Server Reporting Services in the list of installed services. If you have both the SQL Server Reporting Services service and SQL Server Data Tools installed, you will not have to install them again. You may have SQL Server installed but be missing the Data Tools, the SSRS service, or both. Tomorrow's blog post will show how to install and configure SSRS based on where you are now. If you want to learn SSRS in simple words - I strongly recommend you get the Beginning SSRS book from Joes 2 Pros. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Reporting Services, SSRS
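    A rough way to check for a native-mode SSRS installation from the database engine side is to look for the catalog databases it creates; the names below are the defaults and will differ if custom database names were chosen during Reporting Services configuration:
    -- Returns rows only if a Reporting Services catalog has been configured
    -- against this instance (default database names assumed).
    SELECT name
    FROM sys.databases
    WHERE name IN (N'ReportServer', N'ReportServerTempDB');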

    Read the article

  • Make Offscreen Sliding Content Without Hurting SEO [duplicate]

    - by etangins
    This question already has an answer here: How bad is it to use display: none in CSS? (5 answers) On my website I have content which is positioned off the screen and then slides in when you click a button. For example, when you click the news button, content slides in with news. It didn't occur to me that this might be labeled a black-hat SEO technique: I have content positioned off screen with CSS that links elsewhere on my site, and a search engine could easily interpret that as hiding content for SEO purposes. Obviously, my intention was not to hide content, but to make a UI/UX content slider where content slides into view when a button is clicked. How can I make something to this effect (where content slides in and out) that would not compromise SEO?

    Read the article

  • Becoming the well-integrated content company (and combating AIUTLVFS)

    - by Lance Shaw
    Every single day, each of us create more and more content. Sometimes it is brand new material and many times it is iterations of existing content, but no one would argue that information and content growth is growing at an almost exponential rate. With all this content being created and stored, a number of problems naturally arise. One of the most common issues that users run into is "Am I Using The Latest Version of this File Syndrome", or AIUTLVFS. This insidious syndrome is all too common and results in ineffective, poor or downright wrong business decisions being made.  When content or files are unavailable or incorrect within the scope of key business processes, the chance for erroneous and costly business decisions is magnified even further. For many companies, the ideal scenario is to be able to connect multiple business systems, both old and new, into one common content repository.  Not only does this reduce content duplication, it also helps guarantee that everyone in various departments is working off the proverbial "same page".  Sounds simple - but for many organizations, the proliferation of file shares, SharePoint sites, and other storage silos of content keep the dream of a more efficient business a distant one. We've created some online assets to help you in your evaluation and eventual improvement of your current content management and delivery systems. Take a few minutes to check out our Online Assessment Tool.  It's quick, easy and just might provide you with insights into how you can improve your current content ecosystem. While you are there, check out our new Infographic that outlines common issues faced by companies today. Feel free to save our informative Infographic PDF and share it with business colleagues and your management to help them understand the business costs and impact of inaction. Together we can stop AIUTLVFS in its tracks and run our businesses more effectively than ever. Additionally, we hope you will take a few minutes to visit our new and informative webpages dedicated to the value of a well connected, fully integrated content management system. It's a great place to learn more about how integrating WebCenter Content into your infrastructure can lower your operational costs while boosting process and worker efficiency.

    Read the article

  • How to sysprep SQL Server Express?

    - by Jim
    We plan to deploy a Hyper-V VHD with Windows Server 2008 R2 and SQL Server 2012 Express installed to multiple hosts. From my understanding, the correct way to do this is to install SQL Server in preparation mode, sysprep Windows, then complete the SQL Server installation once the VHD is deployed. I mostly followed the process in this blog post: http://sethusrinivasan.com/category/sysprep/ However, after the VHD is deployed, I'm unable to complete the SQL Server installation. It keeps saying "Upgrade matrix is incorrect". It seems that it's trying to upgrade itself to Enterprise edition (I was asked for a product key during install, but I skipped it). Could anyone share their experience with deploying VHDs with SQL Server (we're fine with either SQL Server 2008 R2 or 2012)? I think the source of my issue is that I can't select "Express Edition" when entering the product key at the completion stage, so the installation is trying to do an upgrade to Enterprise Edition. I have no idea why the drop-down list is empty.

    Read the article

  • SQL SERVER – Guest Posts – Feodor Georgiev – The Context of Our Database Environment – Going Beyond the Internal SQL Server Waits – Wait Type – Day 21 of 28

    - by pinaldave
    This guest post is submitted by Feodor. Feodor Georgiev is a SQL Server database specialist with extensive experience of thinking both within and outside the box. He has wide experience of different systems and solutions in the fields of architecture, scalability, performance, etc. Feodor has experience with SQL Server 2000 and later versions, and is certified in SQL Server 2008. In this article Feodor explains the server-client-server process and concentrates on the mutual waits between the client and SQL Server. This is essential in grasping the concept of waits in a 'global' application plan. Recently I was asked to write a blog post about the wait statistics in SQL Server, and since I had been thinking about writing it for quite some time now, here it is. It is a wide-spread idea that the wait statistics in SQL Server will tell you everything about your performance. Well, almost. Or should I say - barely. The reason for this is that SQL Server is always a part of a bigger system - there are always other players in the game: whether it is a client application, web service, any other kind of data import/export process and so on. In short, the SQL Server surroundings look like this: This means that SQL Server, aside from its internal waits, also depends on external waits and settings. As we can see in the picture above, SQL Server needs to have an interface in order to communicate with the surrounding clients over the network. For this communication, SQL Server uses protocol interfaces. I will not go into detail about which protocols are best, but you can read this article. Also, review the information about TDS (Tabular Data Stream). As we all know, our system is only as fast as its slowest component. This means that when we look at our environment as a whole, the SQL Server might be a victim of external pressure, no matter how well we have tuned our database server performance. Let's dive into an example: let's say that we have a web server hosting a web application which is using data from our SQL Server, hosted on another server. The network card of the web server for some reason is malfunctioning (think of a hardware failure, driver failure, or just improper setup) and does not send/receive data faster than 10Mbps. On the other end, our SQL Server will not be able to send/receive data at a faster rate either. This means that the application users will notify the support team and will say: "My data is coming very slow." Now, let's move on to a bit more exciting example: imagine that there is a similar setup as the example above - one web server and one database server - and the application is not using any stored procedure calls, but instead, for every user request, the application sends an 80kb query over the network to the SQL Server. (I really thought this does not happen in real life until I saw it one day.) So, what happens in this case? To make things worse, let's say that the 80kb query text is submitted from the application to the SQL Server at least 100 times per minute, and as often as 300 times per minute at peak times. Here is what happens: in order for this query to reach the SQL Server, it will have to be broken into a number of network packets (according to the packet size settings) and will travel over the network.
    On the other side, our SQL Server network card will receive the packets and pass them to the network layer, the packets will get assembled, and eventually SQL Server will start processing the query - parsing, algebrizing, generating the query execution plan and so on. So far, we have already had a serious network overhead by waiting for the packets to reach our database engine, and there will certainly be some processing overhead until the database engine deals with the 80kb query and its 20 subqueries. The waits you see in the DMVs are actually collected from the point the query reaches the SQL Server and the packets are assembled. Let's say that our query is processed and it finally returns 15000 rows. These rows have a certain size as well, depending on the data types returned. This means that the data will have to be converted into packets (depending on the network packet size settings) and will have to reach the application server. There will also be waits; however, this time you will be able to see a wait type in the DMVs called ASYNC_NETWORK_IO. What this wait type indicates is that the client is not consuming the data fast enough and the network buffers are filling up. Recently Pinal Dave posted a blog on Client Statistics. What Client Statistics does is capture the physical flow characteristics of the query between the client (Management Studio, in this case) and the server, and back to the client. As you see in the image, there are three categories: Query Profile Statistics, Network Statistics and Time Statistics. Number of server roundtrips - a roundtrip consists of a request sent to the server and a reply from the server to the client. For example, if your query has three select statements separated by the 'GO' command, then there will be three different roundtrips. TDS packets sent from the client - TDS (tabular data stream) is the language which SQL Server speaks, and in order for applications to communicate with SQL Server, they need to pack the requests in TDS packets. TDS packets sent from the client is the number of packets sent from the client; if the request is large, it may need more buffers, and eventually might even need more server roundtrips. TDS packets received from server - the TDS packets sent by the server to the client during the query execution. Bytes sent from client - the volume of data sent to our SQL Server, measured in bytes; i.e. how big a query we have sent to the SQL Server. This is why it is best to use stored procedures, since the reusable code (which already exists as an object in the SQL Server) will only be called as the procedure name plus the parameters, and this minimizes the network pressure. Bytes received from server - the amount of data the SQL Server has sent to the client, measured in bytes. Depending on the number of rows and the data types involved, this number will vary. But still, think about the network load when you request data from SQL Server. Client processing time - the amount of time spent in milliseconds between the first received response packet and the last received response packet by the client. Wait time on server replies - the time in milliseconds between the last request packet which left the client and the first response packet which came back from the server to the client.
    Total execution time - the sum of client processing time and wait time on server replies (the SQL Server internal processing time). Here is an illustration of the client-server communication model which should help you understand the mutual waits in a client-server environment. Keep in mind that a query with a large 'wait time on server replies' means the server took a long time to produce the very first row. This is usual for queries that have operators that need the entire sub-query to evaluate before they proceed (for example, sort and top operators). However, a query with a very short 'wait time on server replies' means that the query was able to return the first row fast. A long 'client processing time' does not necessarily imply that the client spent a lot of time processing and the server was blocked waiting on the client; it can simply mean that the server continued to return rows from the result and this is how long it took until the very last row was returned. The bottom line is that developers and DBAs should work together and think carefully about resource utilization in the client-server environment. From experience I can say that so far I have seen only cases where the application developers and the database developers are on their own and do not ask questions about the other party's world. I would recommend using the Client Statistics tool during new development to track the performance of the queries, and also to find a synchronous way of utilizing resources between the client, the server and the client again. Here is another example: think about a similar setup as above, but add another server to the game. Let's say that we keep our media on a separate server, and together with the data from our SQL Server we need to display some images on the webpage requested by our user. No matter how simple or complicated the logic to get the images is, if the images are 500kb each our users will get the page slowly and they will still think that there is something wrong with our data. Anyway, I don't mean to get carried away too far from SQL Server. Instead, what I would like to say is that DBAs should also be aware of 'the big picture'. I wrote a blog post a while back on this topic, and if you are interested, you can read it here about the big picture. And finally, here are some guidelines for monitoring the network performance and improving it: Run a trace and outline all queries that return more than 1000 rows (in Profiler you can actually filter and sort the captured trace by the number of returned rows). This is not a set number; it is more of a guideline. The general thought is that no application user can consume that many rows at once. Ask yourself and your fellow developers: 'why?'. Monitor your network counters in Perfmon: Network Interface: Output queue length, Redirector: Network errors/sec, TCPv4: Segments retransmitted/sec and so on. Make sure to establish a good friendship with your network administrator (buy them coffee, for example) and get into a conversation about the network settings. Have them explain to you how the network cards are set up - are they standalone, are they 'teamed', and what the settings are (full duplex and so on). Find some time to read a bit about networking. In this short blog post I hope I have turned your attention to 'the big picture' and the fact that there are other factors affecting our SQL Server, aside from its internal workings.
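    A small companion check for the client-side pressure described above: the query below (a minimal sketch using only the standard wait-stats DMV) shows how much time the instance has spent waiting for clients to consume result sets since the last restart or statistics clear.
    -- ASYNC_NETWORK_IO accumulates while SQL Server waits for the client or the
    -- network to drain result sets it has already produced.
    SELECT wait_type,
           waiting_tasks_count,
           wait_time_ms,
           max_wait_time_ms
    FROM sys.dm_os_wait_stats
    WHERE wait_type = N'ASYNC_NETWORK_IO';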
As a further reading I would still highly recommend the Wait Stats series on this blog, also I would recommend you have the coffee break conversation with your network admin as soon as possible. This guest post is written by Feodor Georgiev. Read all the post in the Wait Types and Queue series. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, Readers Contribution, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Wait Stats, SQL Wait Types, T SQL

    Read the article

  • Replication: SQL Server 2008 Publisher with SQL Server Express 2005 Subscriber

    - by Jeremy
    Here is the setup: SQL Server 2008 Enterprise with a merge publication, and SQL Server 2005 Express with a pull subscription. There is no web or FTP synchronization set up; this is direct merge replication. Using the RMO objects from C#, I get a "class cannot be found" COM error when accessing the MergePullSubscription.SynchronizationAgent property. I've tried with both the 2008 RMO DLLs (version 10) and the 2005 RMO DLLs (version 9). When trying to use replmerge.exe, I get the following: 2010-04-10 04:12:05.263 Microsoft SQL Server Merge Agent 9.00.1399.06 2010-04-10 04:12:05.294 Copyright (c) 2000 Microsoft Corporation 2010-04-10 04:12:05.294 The timestamps prepended to the output lines are expressed in terms of UTC time. 2010-04-10 04:12:05.294 User-specified agent parameter values: -Publisher SUN -PublisherDB PRIMROSE -PublisherSecurityMode 1 -Publication PRIMROSE -Distributor SUN -DistributorSecurityMode 1 -Subscriber PVILLE\SQLEXPRESS -SubscriberSecurityMode 1 -SubscriberDB PRIMROSE -SubscriptionType 1 -DistributorLogin sa -DistributorPassword ********** -DistributorSecurityMode 0 -PublisherLogin sa -PublisherPassword ********** -PublisherSecurityMode 0 -SubscriberLogin sa -SubscriberPassword ********** -SubscriberSecurityMode 0 2010-04-10 04:12:05.325 Connecting to Subscriber 'PVILLE\SQLEXPRESS' 2010-04-10 04:12:05.481 Connecting to Distributor 'SUN' 2010-04-10 04:12:05.513 The version of SQL Server running at the Distributor (10.0.2531) is not compatible with the version of SQL Server running at the Subscriber (9.00.1399). 2010-04-10 04:12:05.513 Category:NULL Source: Merge Process Number: -2147200979 Message: The version of SQL Server running at the Distributor (10.0.2531) is not compatible with the version of SQL Server running at the Subscriber (9.00.1399). Any ideas?
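    One quick first check before digging into the RMO/COM error is to confirm the exact build and edition on each side (publisher/distributor and subscriber), since the replmerge output above shows a 9.0 merge agent talking to a 10.0 distributor:
    -- Run on both servers; uses only built-in server properties.
    SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
           SERVERPROPERTY('ProductLevel')   AS ProductLevel,
           SERVERPROPERTY('Edition')        AS Edition;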

    Read the article

  • SQL Server Master class winner

    - by Testas
    The winner of the SQL Server MasterClass competition, courtesy of the UK SQL Server User Group and SQL Server Magazine: Steve Hindmarsh. There is still time to register for the seminar yourself at: www.regonline.co.uk/kimtrippsql More information about the seminar: Where: Radisson Edwardian Heathrow Hotel, London When: Thursday 17th June 2010 This one-day MasterClass will focus on many of the top issues companies face when implementing and maintaining a SQL Server-based solution. In the case where a company has no dedicated DBA, IT managers sometimes struggle to keep the data tier performing well and the data available. This can be especially troublesome when the development team is unfamiliar with the effect application design choices have on database performance. The Microsoft SQL Server MasterClass 2010 is presented by Paul S. Randal and Kimberly L. Tripp, two of the most experienced and respected people in the SQL Server world. Together they have over 30 years of combined experience working with SQL Server in the field, and on the SQL Server product team itself. This is a unique opportunity to hear them present at a UK event which will: Debunk many of the ingrained misconceptions around SQL Server's behaviour; Show you disaster recovery techniques critical to preserving your company's life-blood - the data; Explain how a common application design pattern can wreak havoc in the database; Walk through the top-10 points to follow around operations and maintenance for a well-performing and available data tier! Please note: the agenda may be subject to change. Session Abstracts KEYNOTE: Bridging the Gap Between Development and Production Applications are commonly developed with little regard for how design choices will affect performance in production. This is often because developers don't realize the implications of their design on how SQL Server will be able to handle a high workload (e.g. blocking, fragmentation) and/or because there's no full-time trained DBA that can recognize production problems and help educate developers. The keynote sets the stage for the rest of the day, discussing some of the issues that can arise, explaining how some can be avoided, and highlighting some of the features in SQL Server 2008 that can help developers and DBAs make better use of SQL Server and troubleshoot when things go wrong. SESSION ONE: SQL Server Mythbusters It's amazing how many myths and misconceptions have sprung up and persisted over the years about SQL Server - after many years helping people out on forums, newsgroups, and customer engagements, Paul and Kimberly have heard it all. Are there really non-logged operations? Can interrupting shrinks or rebuilds cause corruption? Can you override the server's MAXDOP setting? Will the server always do a table-scan to get a row count? Many myths lead to poor design choices and inappropriate maintenance practices, so these are just a few of the many, many myths that Paul and Kimberly will debunk in this fast-paced session on how SQL Server operates and should be managed and maintained. SESSION TWO: Database Recovery Techniques Demo-Fest Even if a company has a disaster recovery strategy in place, they need to practice it to make sure that the plan will work when a disaster does strike. In this fast-paced demo session Paul and Kimberly will repeatedly do nasty things to databases and then show how they are recovered - demonstrating many techniques that can be used in production for disaster recovery. Not for the faint-hearted!
SESSION THREE: GUIDs: Use, Abuse, and How To Move Forward Since the addition of the GUID (Microsoft’s implementation of the UUID), my life as a consultant and "tuner" has been busy. I’ve seen databases designed with GUID keys run fairly well with small workloads but completely fall over and fail because they just cannot scale. And I know why GUIDs are chosen - it simplifies the handling of parent/child rows in your batches so you can reduce round-trips or avoid dealing with identity values. And, yes, sometimes it's even for distributed databases and/or security that GUIDs are chosen. I'm not entirely against ever using a GUID, but overusing and abusing GUIDs just has to be stopped! Please, please, please let me give you better solutions and explanations on how to deal with your parent/child rows, round-trips and clustering keys! SESSION FOUR: Essential Database Maintenance In this session, Paul and Kimberly will run you through their top-ten database maintenance recommendations, with a lot of tips and tricks along the way. These are distilled from almost 30 years of combined experience working with SQL Server customers and are geared towards making your databases more performant, more available, and more easily managed (to save you time!). Everything in this session will be practical and applicable to a wide variety of databases. Topics covered include: backups, shrinks, fragmentation, statistics, and much more! The focus will be on 2005, but we'll explain some of the key differences for 2000 and 2008 as well. Speaker Biographies: Paul S. Randal and Kimberly L. Tripp. Paul and Kimberly are a husband-and-wife team who own and run SQLskills.com, a world-renowned SQL Server consulting and training company. They are both SQL Server MVPs and Microsoft Regional Directors, with over 30 years of combined experience on SQL Server. Paul worked on the SQL Server team for nine years in development and management roles, writing many of the DBCC commands, and was ultimately responsible for the core Storage Engine for SQL Server 2008. Paul writes extensively on his blog (SQLskills.com/blogs/Paul) and for TechNet Magazine, for which he is also a Contributing Editor. Kimberly worked on the SQL Server team in the early 1990s as a tester and writer before leaving to found SQLskills and embrace her passion for teaching and consulting. Kimberly has been a staple at worldwide conferences since she first presented at TechEd in 1996, and she blogs at SQLskills.com/blogs/Kimberly. They have written Microsoft whitepapers and books for SQL Server 2000, 2005 and 2008, and are regular, top-rated presenters worldwide on database maintenance, high availability, disaster recovery, performance tuning, and SQL Server internals. Together they teach the SQL MCM certification and teach throughout Microsoft. In their spare time, they like to find frogfish in remote corners of the world. Speaker Testimonials "To call them good trainers is an epic understatement. They know how to deliver technical material in ways that illustrate it well. I had to stop Paul at one point and ask him how long it took to build a particular slide because the animations were so good at conveying a hard-to-describe process." "These are not beginner presenters, and they put an extreme amount of preparation and attention to detail into everything that they do. Completely, utterly professional." "When it comes to the instructors themselves, Kimberly and Paul simply have no equal.
Not only are they both ultimate authorities, but they have endless enthusiasm about the material, and spot on delivery. If either ever got tired they never showed it, even after going all day and all week. We witnessed countless demos over the course of the week, some extremely involved, multi-step processes, and I can’t recall one that didn’t go the way it was supposed to." "You might think that with this extreme level of skill comes extreme levels of egotism and lack of patience. Nothing could be further from the truth. ... They simply know how to teach, and are approachable, humble, and patient." "The experience Paul and Kimberly have had with real live customers yields a lot more information and things to watch out for than you'd ever get from documentation alone." “Kimberly, I just wanted to send you an email to let you know how awesome you are! I have applied some of your indexing strategies to our website’s homegrown CMS and we are experiencing a significant performance increase. WOW....amazing tips delivered in an exciting way!  Thanks again” 

    Read the article

  • SQL SERVER – Another lesser known feature of SQL Server Management Studio 2012 – Guest Post by Balmukund Lakhani

    - by Pinal Dave
    This is a fantastic blog post from my dear friend Balmukund ( blog | twitter | facebook ). He had presented a fantastic session at our last UG meeting and there were lots of requests from attendees that he blog about it. Well, here is the blog post about that very popular UG session. Let us read the entire blog post in the voice of Balmukund himself. In one of my previous guest blogs on SQL Authority, I wrote about the "Additional Connection Parameters" tab of the login screen in SQL Server Management Studio (a.k.a. SSMS). Along similar lines, this blog is going to show a little-known new feature of the main login screen ("Connect to Server") of SSMS 2012. You might have seen the screen below countless times and you might wonder what there is to blog about in such a simple screen. Well, continue reading and you will get the answer. Many times, DBAs have to log in to a production server from a non-regular machine, maybe a developer's workstation. You log in to SQL, do your work and close Management Studio. Did you know that your server name is saved by Management Studio? Of course, it is a very useful feature, because you may not want to type the server name/IP address every time. Whatever servers you have connected to are stored by Management Studio. But sometimes it's annoying! What would you do if you wanted SQL Server Management Studio to forget "all" the servers listed in the Server name drop-down? To do that, you need to know how and where they are stored. You can use one of my favorite tools from Sysinternals called Process Monitor (also known as ProcMon) and easily figure out that this is stored in a file under your Windows user profile. Below is the file for SQL 2008 R2 Management Studio: %appdata%\Microsoft\Microsoft SQL Server\100\Tools\Shell\SqlStudio.bin For SQL Server 2012, here is what we can see in ProcMon, so the path is %appdata%\Microsoft\Microsoft SQL Server\110\Tools\Shell\SqlStudio.bin So far, you might wonder, where is the new feature? I have been asked by many users how to delete entries from the SSMS "Connect to Server" server name list. Well, unofficially, you can directly delete the file which we found via ProcMon. Note that deleting the file to get rid of the server list is not officially supported by Microsoft. A better way to achieve this is provided in SSMS 2012. To delete servers from the list, highlight the name you want to delete (via keyboard or mouse) and then press the Delete key. Multi-select is not possible, so it has to be done one by one, but we can delete as many entries as we want. I have deleted a few from the first screenshot taken, and here is the modified version. This is not available in SQL 2008 R2 and its previous versions. This came from feedback given to the SQL Server product group. Hope you have learned something new today! Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • The Minimalist Approach to Content Governance - Create Phase

    - by Kellsey Ruppel
    Originally posted by John Brunswick. In this installment of our Minimalist Approach to Content Governance we finally get to the fun part of the content creation process! Once the content requester has addressed the items outlined in the Request Phase, it is time to set up and begin the production of content. For this to be done correctly, it is important that the content be assigned appropriate workflow and security information. As in our prior phase, let's take a look at what can be done to streamline this process, as contributors are focused on getting information to their end users as quickly as possible. This often means that details around how to ensure that the materials are properly managed can be overlooked, but fortunately there are some techniques that leverage our content management system's native capabilities to automatically take care of some of the details. 1. Determine Access Why - Even if content is not something that needs to be restricted for security reasons, it is helpful to apply access rights so that the content ends up being visible only to the users it relates to. This will greatly improve the user experience. For instance, if your team is working on a group project, many of your fellow company employees do not need to see the content that is being worked on for that project. How - Make use of native content features that allow propagation of security and meta data from parent folders within your content system that have been set up for your particular effort. This makes it painless to enforce security, as well as meta data policies, even for the most unorganized users. The default settings at a parent level can be set once the content creation request has been accepted and a location in the content management system is assigned for your specific project. Impact - Users can find information with less effort, as they will only be exposed to what they need for their work, and they can leverage advanced search features to take advantage of the meta data assigned to content. The combination of default security and meta data will also help in running reports against the content in the Manage and Retire stages that we will discuss in the next 2 posts. 2. Assign Workflow (optional depending on nature of content) Why - Every case for workflow is going to be a bit different, but it generally involves ensuring that content conforms to management, legal and/or editorial requirements. How - Oracle's Universal Content Management offers two ways of helping to workflow content without much effort. Workflow can be applied to content based on Criteria acting on meta data, or explicitly assigned to content with a Basic workflow. Impact - Any content that needs additional attention before release is addressed, allowing users to comment and version until a suitable result is reached. By using inheritance from parent folders within the content management system, content can automatically be given the right security, meta data and workflow information for a particular project's content. This relieves management teams and content contributors of the burden of doing this for every piece of content. We will cover more about the management phase of the content lifecycle in our next installment.

    Read the article

  • Formatting data from management database

    - by bVector
    I've got some data that goes like this:

    Config_Name       Question            Answer
    Cisco WAN         Sensitivity:        High
    Cisco WAN         Authorized Users:   Brent, Charles
    Cisco WAN         Last Audited:       n/a
    Cisco WAN         Next Audit:         3/30/2012
    Cisco WAN         Audit Signature:
    Cisco WAN         Username:           MYCOMPANY
    Cisco WAN         Password:
    Cisco WAN         Encrypted-A         ENCRYPTED DATA
    Cisco WAN         Encrypted-B
    Cisco WAN         Encrypted-C
    vCenter server    Sensitivity:        High
    vCenter server    Authorized Users:   Brent, Charles
    vCenter server    Last Audited:
    vCenter server    Next Audit:         3/30/2012
    vCenter server    Audit Signature:    ENCRYPTED DATA
    vCenter server    Username:           administrator
    vCenter server    Password:
    vCenter server    Encrypted-A         ENCRYPTED DATA
    vCenter server    Encrypted-B
    vCenter server    Encrypted-C
    AKSC-NE01 IPMI    Sensitivity:        High
    AKSC-NE01 IPMI    Authorized Users:   Brent, Charles
    AKSC-NE01 IPMI    Last Audited:
    AKSC-NE01 IPMI    Next Audit:         3/30/2012
    AKSC-NE01 IPMI    Audit Signature:    ENCRYPTED DATA
    AKSC-NE01 IPMI    Username:           MYCOMPANY
    AKSC-NE01 IPMI    Password:
    AKSC-NE01 IPMI    Encrypted-A         ENCRYPTED DATA
    AKSC-NE01 IPMI    Encrypted-B
    AKSC-NE01 IPMI    Encrypted-C

    and I need it to be in this format (one row per Config_Name, one column per question):

    Config_Name    Sensitivity:    Authorized Users:    Last Audited:    Next Audit:    Audit Signature:    Username:    Password:    Encrypted-A    Encrypted-B    Encrypted-C
    AKSC-NE01 IPMI    High    Brent, Charles    3/30/2012    ENCRYPTED DATA    MYCOMPANY    ENCRYPTED DATA
    Cisco ASA5505 WAN    High    Brent, Charles    n/a    3/30/2012    ENCRYPTED DATA    MYCOMPANY    ENCRYPTED DATA
    vCenter server    High    Brent, Charles    3/30/2012    ENCRYPTED DATA    administrator    ENCRYPTED DATA

    The tabs get messed up on here but hopefully you get my drift. Does anyone know an easy way to do this? I haven't found one with Excel just yet.
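    One way to produce the second shape inside SQL Server itself is a PIVOT query. The sketch below is illustrative only: it assumes the rows above have been loaded into a staging table called dbo.ConfigAnswers with exactly the three columns Config_Name, Question and Answer (the table name is hypothetical), and that the Question values match the column list character for character.

    -- Illustrative sketch: one row per Config_Name, one column per Question.
    -- dbo.ConfigAnswers(Config_Name, Question, Answer) is an assumed staging table.
    SELECT  Config_Name,
            [Sensitivity:], [Authorized Users:], [Last Audited:], [Next Audit:],
            [Audit Signature:], [Username:], [Password:],
            [Encrypted-A], [Encrypted-B], [Encrypted-C]
    FROM    dbo.ConfigAnswers
    PIVOT   ( MAX(Answer)
              FOR Question IN ( [Sensitivity:], [Authorized Users:], [Last Audited:], [Next Audit:],
                                [Audit Signature:], [Username:], [Password:],
                                [Encrypted-A], [Encrypted-B], [Encrypted-C] ) ) AS p;

    The same result can be had in Excel with a PivotTable (Config_Name on rows, Question on columns, Answer as the value), if the data never needs to touch SQL Server.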

    Read the article

  • cannot connect to sql server express from sql server standard

    - by Jackson Sunuwar
    ... like my title says... I cannot connect to my instance on SQL Server Express from SQL Server Standard... I have tried disabling the firewall and checked that sqlbrowser is started, but for some reason I cannot connect to my database... called server_name\sqlexpress.. I have a virtual machine with a full-scale MS SQL Server 2008 R2 running on it... and I have several other VMs running SQL Express. They run fine and I can connect to them using SQL Express... but when I try to access them from SQL Server Standard... I get this error. A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified) (Microsoft SQL Server, Error: -1) Digging deep into the error, I found this: Error Number: -1 Severity: 20 State: 0 and finally this... Program Location: at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) at System.Data.SqlClient.TdsParser.Connect(ServerInfo serverInfo, SqlInternalConnectionTds connHandler, Boolean ignoreSniOpenTimeout, Int64 timerExpire, Boolean encrypt, Boolean trustServerCert, Boolean integratedSecurity, SqlConnection owningObject) at System.Data.SqlClient.SqlInternalConnectionTds.AttemptOneLogin(ServerInfo serverInfo, String newPassword, Boolean ignoreSniOpenTimeout, Int64 timerExpire, SqlConnection owningObject) at System.Data.SqlClient.SqlInternalConnectionTds.LoginNoFailover(String host, String newPassword, Boolean redirectedUserInstance, SqlConnection owningObject, SqlConnectionString connectionOptions, Int64 timerStart) at System.Data.SqlClient.SqlInternalConnectionTds.OpenLoginEnlist(SqlConnection owningObject, SqlConnectionString connectionOptions, String newPassword, Boolean redirectedUserInstance) at System.Data.SqlClient.SqlInternalConnectionTds..ctor(DbConnectionPoolIdentity identity, SqlConnectionString connectionOptions, Object providerInfo, String newPassword, SqlConnection owningObject, Boolean redirectedUserInstance) at System.Data.SqlClient.SqlConnectionFactory.CreateConnection(DbConnectionOptions options, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningConnection) at System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup) at System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection) at System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory) at System.Data.SqlClient.SqlConnection.Open() at Microsoft.SqlServer.Management.SqlStudio.Explorer.ObjectExplorerService.ValidateConnection(UIConnectionInfo ci, IServerType server) at Microsoft.SqlServer.Management.UI.ConnectionDlg.Connector.ConnectionThreadUser() The firewall is turned off on the VM that's running mssqlserver... I turned off the firewall on one of the VMs that's running SQL Express but I still get the error... can someone please help... thank you
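    Not an answer to the exact stack trace above, but a quick sanity check that often helps with error 26: confirm, on the Express VM itself, that the instance is actually listening on TCP and note the port, for example by searching the current errorlog. This is a rough sketch using the undocumented xp_readerrorlog procedure; on SQL Server 2005 the search string may need to be passed without the N prefix.

    -- Run locally on the SQLEXPRESS instance (illustrative; xp_readerrorlog is undocumented).
    -- Parameters: log file number (0 = current), log type (1 = SQL Server), search string.
    EXEC xp_readerrorlog 0, 1, N'Server is listening on';
    -- If no TCP entries show up, enable TCP/IP in SQL Server Configuration Manager and restart
    -- the service; also make sure the SQL Browser service is running and UDP 1434 is reachable
    -- from the machine running SQL Server Standard, since named instances rely on it.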

    Read the article

  • SQL SERVER – Importing CSV File Into Database – SQL in Sixty Seconds #018 – Video

    - by pinaldave
    Importing data into a database is one of the most important tasks. I often receive questions regarding the quickest way to insert CSV data, or how to import CSV data into a SQL Server table. Honestly, the process is very simple and the script is even simpler. In today's SQL in Sixty Seconds video we will learn how quickly we can insert CSV data into SQL Server. The steps to import a CSV are very simple: create the table, use BULK INSERT to import the data, verify the data. Done! Absolutely, it is that simple. More on Importing CSV Data: SQL SERVER – Import CSV File Into SQL Server Using Bulk Insert – Load Comma Delimited File Into SQL Server; SQL SERVER – Import CSV File into Database Table Using SSIS; SQL SERVER – Create a Comma Delimited List Using SELECT Clause From Table Column; SQL SERVER – Comma Separated Values (CSV) from Table Column; SQL SERVER – Comma Separated Values (CSV) from Table Column – Part 2. I encourage you to submit your ideas for SQL in Sixty Seconds. We will try to accommodate as many as we can. If we like your idea we promise to share with you educational material. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video
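    For reference, here is a minimal sketch of the create/BULK INSERT/verify sequence described in the post. The table name, file path and column layout are hypothetical and only illustrate the pattern; adjust the terminators to match your file.

    -- Illustrative sketch of the three steps: create the table, bulk insert, verify.
    CREATE TABLE dbo.ImportedCSV
    (
        ID        INT,
        FirstName VARCHAR(50),
        LastName  VARCHAR(50)
    );

    BULK INSERT dbo.ImportedCSV
    FROM 'C:\Data\sample.csv'            -- hypothetical file path
    WITH
    (
        FIELDTERMINATOR = ',',           -- comma-delimited values
        ROWTERMINATOR   = '\n',          -- one record per line
        FIRSTROW        = 2              -- skip the header row (remove if the file has none)
    );

    SELECT * FROM dbo.ImportedCSV;       -- verify the data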

    Read the article

  • SQL SERVER – Remove Debug Button in SSMS – SQL in Sixty Seconds #020 – Video

    - by pinaldave
    SQL in Sixty Seconds is indeed tremendous fun to do. Every week, we try to come up with some new learning which we can share in sixty seconds. In this busy world, we all have sixty seconds to learn something new – no matter how busy we are. In this episode of the series, we talk about another interesting feature of SQL Server Management Studio. In SQL Server Management Studio (SSMS) we have two buttons side by side: 1) Execute (!) and 2) Debug (>). It is quite confusing to a few developers. The Debug button, which looks like a play button, encourages developers to click it, thinking it will execute the code. Developers with a Visual Studio background also often click it out of habit. However, the Debug button is not the same as the Execute button. In most cases developers want to click Execute to run the query, but by mistake they click Debug and it wastes their valuable time. It is very easy to fix this. If developers are not frequently using the debug feature in SQL Server, they should hide it from the toolbar itself. This will greatly reduce the chances of incorrectly clicking the Debug button, and it will also save developers a lot of time, as invoking the debug process and turning it off takes a few extra moments. In this sixty-second video we will discuss how one can hide the Debug button and avoid confusion with the Execute button. I personally use the function key F5 to execute T-SQL code, so I do not face this problem that often. More on Removing Debug Button in SSMS: SQL SERVER – Read Only Files and SQL Server Management Studio (SSMS); SQL SERVER – Standard Reports from SQL Server Management Studio – SQL in Sixty Seconds #016 – Video; SQL SERVER – Discard Results After Query Execution – SSMS; SQL SERVER – Tricks to Comment T-SQL in SSMS – SQL in Sixty Seconds #019 – Video; SQL SERVER – Right Aligning Numerics in SQL Server Management Studio (SSMS). I encourage you to submit your ideas for SQL in Sixty Seconds. We will try to accommodate as many as we can. If we like your idea we promise to share with you educational material. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology, Video

    Read the article

  • Oracle-AmberPoint Webcast: Learn How Your Business Can Profit from the Combination

    - by jyothi.swaroop
    With the recent acquisition of AmberPoint, Oracle now offers an enhanced end-to-end SOA solution that features runtime governance, business transaction management, and cross-platform management capabilities. Put that solution to work and your business can achieve lower costs of implementation and higher profit. Join Ed Horst, Vice President, Oracle (former CMO of AmberPoint), and Ashish Mohindroo, Senior Director, Product Marketing, Oracle, as they discuss the customer advantages of the Oracle and AmberPoint combination in this live Webcast. Learn how our SOA solutions with AmberPoint capabilities can help you: achieve more agility and visibility into your business processes; increase control and performance of critical applications; and improve performance and reduce IT costs to benefit your bottom line. Register for the live Webcast. Date: Thursday, May 20, 2010. Time: 10 a.m. PT/1 p.m. ET

    Read the article
