Search Results

Search found 34465 results on 1379 pages for 'database permissions'.


  • web services access not being reached thru the web browser [closed]

    - by Tony
    I am trying to reference my .asmx web services in .NET, but my server is not exposed to the internet. When I browse to the following address I get the message quoted below. What's the reason for not being able to see the directory? Am I missing something in my IIS configuration? Am I missing anything in my permissions? For reference, I have other folders with web services and I have the same issue with them. When I log in to the server I do so with my Windows user and password (I am using Windows authentication). It's worth mentioning that when I enter the URL I get a popup asking for my user ID and password, but it seems unable to validate them, since it keeps asking a couple of times. Let me know if you need more information to address this issue.

    http://appsvr02/Inetpub/wwwroot/DevWebApi/

    Internet Explorer cannot display the webpage

    What you can try: It appears you are connected to the Internet, but you might want to try to reconnect to the Internet. Retype the address. Go back to the previous page.

    Most likely causes:
    • You are not connected to the Internet.
    • The website is encountering problems.
    • There might be a typing error in the address.

    More information: This problem can be caused by a variety of issues, including:
    • Internet connectivity has been lost.
    • The website is temporarily unavailable.
    • The Domain Name Server (DNS) is not reachable.
    • The Domain Name Server (DNS) does not have a listing for the website's domain.
    • If this is an HTTPS (secure) address, click Tools, click Internet Options, click Advanced, and check to be sure the SSL and TLS protocols are enabled under the security section.

    For offline users: you can still view subscribed feeds and some recently viewed webpages. To view subscribed feeds: 1. Click the Favorites Center button, click Feeds, and then click the feed you want to view. To view recently visited webpages (might not work on all pages): 1. Click Tools, and then click Work Offline. 2. Click the Favorites Center button, click History, and then click the page you want to view.

    Read the article

  • How to overcome shortcomings in reporting from EAV database?

    - by David Archer
    The major shortcomings of Entity-Attribute-Value database designs in SQL all seem to be related to being able to query and report on the data efficiently and quickly. Most of the information I've read on the subject warns against implementing EAV because of these problems and because querying/reporting is common to almost all applications. I am currently designing a system where almost all of the fields necessary for data storage are not known at design/compile time and are defined by the end user of the system. EAV seems like a good fit for this requirement, but because of the problems I've read about, I am hesitant to implement it, as there are also some pretty heavy reporting requirements for this system.

    I think I've come up with a way around this, but would like to pose the question to the SO community. Given that a typical normalized database (OLTP) still isn't always the best option for running reports, a good practice seems to be having a "reporting" database (OLAP) to which the data from the normalized database is copied, indexed extensively, and possibly denormalized for easier querying. Could the same idea be used to work around the shortcomings of an EAV design? The main downside I see is the increased complexity of transferring the data from the EAV database to the reporting one, as you may end up having to alter the tables in the reporting database as new fields are defined in the EAV database. But that is hardly impossible, and it seems an acceptable tradeoff for the increased flexibility of the EAV design. This downside also exists if I use a non-SQL data store (i.e. CouchDB or similar) for the main data storage, since all the standard reporting tools expect a SQL backend to query against. Do the issues with EAV systems mostly go away if you have a separate reporting database for querying?

    EDIT: Thanks for the comments so far. One important thing about the system I'm working on is that I'm really only talking about using EAV for one of the entities, not everything in the system. The whole gist of the system is to pull data from multiple disparate sources that are not known ahead of time and crunch the data to come up with some "best known" data about a particular entity. So every "field" I'm dealing with is multi-valued, and I'm also required to track history for each. The normalized design for this ends up being one table per field, which makes querying it kind of painful anyway.
    Here are the table schemas and sample data I'm looking at (obviously changed from what I'm working on, but I think they illustrate the point well):

    EAV Tables

    Person
    -------------------
    - Id  - Name      -
    -------------------
    - 123 - Joe Smith -
    -------------------

    Person_Value
    -------------------------------------------------------------------
    - PersonId - Source - Field       - Value         - EffectiveDate -
    -------------------------------------------------------------------
    - 123      - CIA    - HomeAddress - 123 Cherry Ln - 2010-03-26    -
    - 123      - DMV    - HomeAddress - 561 Stoney Rd - 2010-02-15    -
    - 123      - FBI    - HomeAddress - 676 Lancas Dr - 2010-03-01    -
    -------------------------------------------------------------------

    Reporting Table

    Person_Denormalized
    ----------------------------------------------------------------------------------------
    - Id  - Name      - HomeAddress   - HomeAddress_Confidence - HomeAddress_EffectiveDate -
    ----------------------------------------------------------------------------------------
    - 123 - Joe Smith - 123 Cherry Ln - 0.713                  - 2010-03-26                -
    ----------------------------------------------------------------------------------------

    Normalized Design

    Person
    -------------------
    - Id  - Name      -
    -------------------
    - 123 - Joe Smith -
    -------------------

    Person_HomeAddress
    ------------------------------------------------------
    - PersonId - Source - Value         - Effective Date -
    ------------------------------------------------------
    - 123      - CIA    - 123 Cherry Ln - 2010-03-26     -
    - 123      - DMV    - 561 Stoney Rd - 2010-02-15     -
    - 123      - FBI    - 676 Lancas Dr - 2010-03-01     -
    ------------------------------------------------------

    The "Confidence" field here is generated using logic that cannot be expressed easily (if at all) in SQL, so my most common operation, besides inserting new values, will be pulling ALL data about a person for all fields so I can generate the record for the reporting table. This is actually easier in the EAV model, as I can do a single query. In the normalized design, I end up having to do one query per field to avoid a massive Cartesian product from joining them all together.
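    For illustration, here is a sketch of the two query shapes being compared, using the table and column names above (T-SQL flavored; the "Effective Date" column in Person_HomeAddress is written without the space so the example is runnable):

        -- EAV model: one query returns every field, from every source, with full history.
        SELECT p.Id, p.Name, v.Source, v.Field, v.Value, v.EffectiveDate
        FROM Person p
        JOIN Person_Value v ON v.PersonId = p.Id
        WHERE p.Id = 123;

        -- Normalized model: one query (and one join) per field table, so pulling
        -- "everything about a person" means N separate queries like this one.
        SELECT p.Id, p.Name, ha.Source, ha.Value, ha.EffectiveDate
        FROM Person p
        JOIN Person_HomeAddress ha ON ha.PersonId = p.Id
        WHERE p.Id = 123;
        -- ...repeated for Person_WorkAddress, Person_Phone, and so on.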

    Read the article

  • PerformancePoint dashboard permissions problem in MOSS

    - by Nathan DeWitt
    I have a PerformancePoint dashboard running in a MOSS 2007 portal. The dashboard consists of one SSRS 2005 report, running in SharePoint integrated mode. NT Authority\Authenticated Users have read permissions to the report library containing the SSRS report, the dashboard, and the report library containing the dashboard. Users that attempt to access the dashboard receive the following error message:

    The permissions granted to user 'DOMAIN\firstname.lastname' are insufficient for performing this operation. (rsAccessDenied)

    Users that then click on the direct link to the report in MOSS will see the report with no problem, and subsequent visits to the dashboard show the report with no problem. The report is using a data source that is located one folder up from the report location. The report has been updated to point to the correct shared data source after deployment. Both the report and the data source have been published. The data source is using stored credentials, with a domain service account that has been set to "Use as Windows credentials". This service account is serving other reports in other areas with no problem.

    Edit: OK, I've gotten a lot more information on this problem. The request is never actually being made to the data source. The user comes to the dashboard and requests a report for the first time using their Kerberos token identifying themselves. The report server looks in its database, finds that they are not listed in the Users table, and generates this rsAccessDenied error. Once they view the report directly, their name is in this table and they never have the problem again. Unfortunately, removing the user from the Users table in the RS database doesn't actually cause this error to happen again. Everything I've read says that when you run a report server in MOSS integrated mode all your permissions are handled at the MOSS report library level, and all authenticated users have permissions to the report library, as stated earlier. Any ideas?
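    For anyone tracing the same symptom, a minimal diagnostic sketch for checking whether an account has been recorded in the report server catalog, assuming the default ReportServer database name (dbo.Users is an internal, undocumented table, so treat this as read-only diagnostics and don't rely on its shape across versions):

        -- Has the report server ever recorded this user?
        SELECT UserName, UserType, AuthType
        FROM ReportServer.dbo.Users
        WHERE UserName = N'DOMAIN\firstname.lastname';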

    Read the article

  • Facebook Connect: Permissions Error [200] using "stream.publish" with PHP

    - by Sarah
    Hi all, I've been implementing Facebook Connect into a site and am using both the PHP API, to allow me to automatically post data to a user's wall, and the JS API, for manual posting, permissions dialogs, etc. When the user uses the manual method it works 100%: the popups are displayed correctly, and the data gets posted to their wall properly. However, when I try to use the PHP API I am getting inconsistencies. When I post automatically using the PHP API with one account it works perfectly, every time. But for some other accounts it never works, always returning "Permissions error." The error code is 200, and I've checked the Facebook API documentation, which is pretty vague, saying only "Permissions error. The application does not have permission to perform this action." But that's not true, since it works on some accounts and doesn't work on others.

    First, I've made sure that the users in question have enabled the extended permission "publish_stream" and that the manual method using the JS API works, so it doesn't seem to be a problem with those specific permissions. There are no apparent differences between the Facebook accounts I've used. So my question is: has anyone run into this problem and found a solution to it? Is there some sort of other permission setting that users must enable for this to work? I've been searching Google and these forums but have not found any solution.

    The request I am sending is (note: the content/image URL/link URL are not the actual data I use):

        $attachment = array(
            'caption' => '{*actor*} commented on <title> "<comment>"',
            'media' => array(
                array(
                    'type' => 'image',
                    'src' => 'http://www.test.com/image.jpg',
                    'href' => 'http://www.test.com'
                )
            )
        );
        $Facebook->api_client->stream_publish('', $attachment);

    Thanks, Sarah

    Read the article

  • silent failure while creating odbc data source

    - by Peter
    I just got really confused trying to create an ODBC data source in Windows 2003 R2. I can create a connection to my chosen server (a MS SQL Server) on the "User DSN" tab, but when I try to do the same thing on the "System DSN" tab, the process fails without an error message. I am able to connect to the target database fine at the end of configuring a new data source, but when I click OK, the data source just isn't there. No error message, no sign that anything went amiss, other than the lack of a new data source. Very annoying, as I had to repeat the process a few times to make sure I wasn't crazy. Anybody got any hints? I suspect it is a permission problem of some sort, but since there is no error message, I don't know where to start.

    Read the article

  • FTP Upload works from local command line / remote GUI client but not from PHP script

    - by MrOodles
    I originally posted this question on Stack Overflow, but I'm beginning to think it's more of a server question. I have installed ProFTPD on an EC2 instance running Ubuntu 10.10. I have managed my proftpd.conf file as well as my server permissions so that I can connect and upload/move files over FTP, both remotely using FileZilla and on the server itself when connecting to 127.0.0.1. The problem I'm running into is when I try to upload/install a file using Joomla's interface. I give Joomla the same login information that I give to FileZilla, and the connection is made in the same fashion. The ftp.log file actually shows that Joomla is able to log in to the server:

        localhost UNKNOWN nobody [17/Jan/2011:14:09:17 +0000] "USER ftpuser" 331 -
        localhost UNKNOWN ftpuser [17/Jan/2011:14:09:17 +0000] "PASS (hidden)" 230 -
        localhost UNKNOWN ftpuser [17/Jan/2011:14:09:17 +0000] "PASV" 227 -
        localhost UNKNOWN ftpuser [17/Jan/2011:14:09:17 +0000] "TYPE I" 200 -
        localhost UNKNOWN ftpuser [17/Jan/2011:14:09:17 +0000] "STOR /directory/store/location/file.zip" 550 -

    But it fails when attempting the STOR command. I have traced the problem in the Joomla code to the PHP FTP module. The code (with my trace statements added):

        if (@ftp_put($this->_conn, $remote, $local, $mode) === false) {
            echo "\n FTP PUT failed.";
            echo "\n Remote: $remote ; Local: $local ; Mode: $mode - Either ASCII: ".FTP_ASCII." or Binary: ".FTP_BINARY;
            echo "\n The user: ".exec("whoami");
            JError::raiseWarning('35', 'JFTP::store: Bad response');
            return false;
        }

    Trace outputs:

        FTP PUT failed.
        Remote: /directory/store/location/file.zip ; Local: /tmp/phpwuccp4 ; Mode: 2 - Either ASCII: 1 or Binary: 2
        The user: www-data

    And in case you were curious, here is an example of the FTP log when using FileZilla:

        my_client_ip UNKNOWN nobody [17/Jan/2011:16:45:55 +0000] "USER ftpuser" 331 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:45:55 +0000] "PASS (hidden)" 230 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:45:55 +0000] "OPTS UTF8 ON" - -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:45:55 +0000] "PWD" 257 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:45:55 +0000] "TYPE I" 200 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:45:55 +0000] "PASV" 227 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:45:55 +0000] "MLSD" 226 3405
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:06 +0000] "CWD location" 250 3405
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:06 +0000] "PWD" 257 3405
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:06 +0000] "PASV" 227 3405
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:07 +0000] "MLSD" 226 3757
        my_client_ip UNKNOWN nobody [17/Jan/2011:16:46:37 +0000] "USER ftpuser" 331 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:37 +0000] "PASS (hidden)" 230 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:37 +0000] "OPTS UTF8 ON" - -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:37 +0000] "CWD /location" 250 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:37 +0000] "PWD" 257 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:37 +0000] "TYPE I" 200 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:37 +0000] "PASV" 227 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:39 +0000] "STOR file.zip" 226 125317
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:39 +0000] "PASV" 227 -
        my_client_ip UNKNOWN ftpuser [17/Jan/2011:16:46:39 +0000] "MLSD" 226 497

    Read the article

  • SQL SERVER – Get All the Information of Database using sys.databases

    - by pinaldave
    Earlier I wrote the blog article SQL SERVER – Finding Last Backup Time for All Database. In response to that article, I received a very interesting script from SQL Server expert Matteo as a comment on the blog. He wrote the script using sys.databases, which provides plenty of information about each database. I suggest you run this on your server and learn what you did not know about your databases.

        SELECT database_id,
            CONVERT(VARCHAR(25), DB.name) AS dbName,
            CONVERT(VARCHAR(10), DATABASEPROPERTYEX(name, 'status')) AS [Status],
            state_desc,
            (SELECT COUNT(1) FROM sys.master_files WHERE DB_NAME(database_id) = DB.name AND type_desc = 'rows') AS DataFiles,
            (SELECT SUM((size*8)/1024) FROM sys.master_files WHERE DB_NAME(database_id) = DB.name AND type_desc = 'rows') AS [Data MB],
            (SELECT COUNT(1) FROM sys.master_files WHERE DB_NAME(database_id) = DB.name AND type_desc = 'log') AS LogFiles,
            (SELECT SUM((size*8)/1024) FROM sys.master_files WHERE DB_NAME(database_id) = DB.name AND type_desc = 'log') AS [Log MB],
            user_access_desc AS [User access],
            recovery_model_desc AS [Recovery model],
            CASE compatibility_level
                WHEN 60 THEN '60 (SQL Server 6.0)'
                WHEN 65 THEN '65 (SQL Server 6.5)'
                WHEN 70 THEN '70 (SQL Server 7.0)'
                WHEN 80 THEN '80 (SQL Server 2000)'
                WHEN 90 THEN '90 (SQL Server 2005)'
                WHEN 100 THEN '100 (SQL Server 2008)'
            END AS [compatibility level],
            CONVERT(VARCHAR(20), create_date, 103) + ' ' + CONVERT(VARCHAR(20), create_date, 108) AS [Creation date],
            -- last backup
            ISNULL((SELECT TOP 1
                CASE TYPE
                    WHEN 'D' THEN 'Full'
                    WHEN 'I' THEN 'Differential'
                    WHEN 'L' THEN 'Transaction log'
                END + ' – ' +
                LTRIM(ISNULL(STR(ABS(DATEDIFF(DAY, GETDATE(), Backup_finish_date))) + ' days ago', 'NEVER')) + ' – ' +
                CONVERT(VARCHAR(20), backup_start_date, 103) + ' ' + CONVERT(VARCHAR(20), backup_start_date, 108) + ' – ' +
                CONVERT(VARCHAR(20), backup_finish_date, 103) + ' ' + CONVERT(VARCHAR(20), backup_finish_date, 108) +
                ' (' + CAST(DATEDIFF(second, BK.backup_start_date, BK.backup_finish_date) AS VARCHAR(4)) + ' seconds)'
                FROM msdb..backupset BK
                WHERE BK.database_name = DB.name
                ORDER BY backup_set_id DESC), '-') AS [Last backup],
            CASE WHEN is_fulltext_enabled = 1 THEN 'Fulltext enabled' ELSE '' END AS [fulltext],
            CASE WHEN is_auto_close_on = 1 THEN 'autoclose' ELSE '' END AS [autoclose],
            page_verify_option_desc AS [page verify option],
            CASE WHEN is_read_only = 1 THEN 'read only' ELSE '' END AS [read only],
            CASE WHEN is_auto_shrink_on = 1 THEN 'autoshrink' ELSE '' END AS [autoshrink],
            CASE WHEN is_auto_create_stats_on = 1 THEN 'auto create statistics' ELSE '' END AS [auto create statistics],
            CASE WHEN is_auto_update_stats_on = 1 THEN 'auto update statistics' ELSE '' END AS [auto update statistics],
            CASE WHEN is_in_standby = 1 THEN 'standby' ELSE '' END AS [standby],
            CASE WHEN is_cleanly_shutdown = 1 THEN 'cleanly shutdown' ELSE '' END AS [cleanly shutdown]
        FROM sys.databases DB
        ORDER BY dbName, [Last backup] DESC, NAME

    Please let me know if you find this information useful. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, Readers Contribution, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

    Read the article

  • SQLAuthority News – Bookmark – Deprecated Database Engine Features in SQL Server 2008

    - by pinaldave
    When anybody asks me whether a specific feature is available in SQL Server 2008, or whether a feature will be disabled in future versions of SQL Server, I always point them to the following lists, where all the deprecated database engine features are documented:

    Deprecated Database Engine Features in SQL Server 2008 R2
    Deprecated Database Engine Features in SQL Server 2008

    This list is quite helpful and everybody should refer to it at least once; it has many important details. For example, it notes that "80 compatibility level and upgrade from version 80" will not be supported in the next version of SQL Server. If you are still using SQL Server 2000 today (by any chance), you will not be able to upgrade it to the next version of SQL Server directly. It is very important to check whether any SQL Server feature you use in compatibility mode appears in the lists above; if it does, you need to start working on the replacement suggested in the article.

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Bookmark, SQL, SQL Authority, SQL Documentation, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology
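    As a quick check against that first item, a short sketch (assuming SQL Server 2005 or later, where sys.databases is available) that flags databases still at the SQL Server 2000 compatibility level:

        -- Databases still at compatibility level 80 (or lower) will block
        -- a direct upgrade to the next version of SQL Server.
        SELECT name, compatibility_level
        FROM sys.databases
        WHERE compatibility_level <= 80;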

    Read the article

  • Free NOSQL database for use with C# client [closed]

    - by Mitten
    I've never used NoSQL databases before, but so far they seem like the best data storage solution for my project. I am going to implement a data mining application. The data I would like to mine is thousands of documents which cannot be imported into data mining applications directly. To make the import easier and faster (than importing thousands of documents), I am planning to import these documents into a NoSQL database first and then import the NoSQL database into the data mining software. At the very least, once I have all the data in a NoSQL database, I should be able to code the simplest data mining logic myself.

    Am I correct that NoSQL databases allow you to create records of data but don't mandate that all records adhere to the same data schema (the same column names/types as in a classic table-oriented SQL database)? I think for each document I would create a row/entry/object (not sure what the correct term is in the NoSQL world) with a string id, a few (columns of) unstructured text data, and dozens of columns, mostly of datetime and integer types. As its name implies, NoSQL does not support SQL query syntax, but it does support locating an object (row/entry?) by its unique id. Does NoSQL support querying objects using property=value syntax?

    Unfortunately, most free NoSQL databases only support Java/C++ clients. Which free NoSQL database would you recommend for a C# programmer?

    Read the article

  • SQLAuthority News – Mark the Date: October 16, 2013 – Introducing NuoDB Blackbirds: THE Distributed Database

    - by Pinal Dave
    I am very excited to announce, first on this blog, the release of NuoDB Blackbirds (NuoDB Release 2.0). NuoDB is my favorite application to work with data nowadays. They are increasingly gaining market share as well as bringing out new features with every new release. I was very excited when I learned that NuoDB is releasing their flagship release 2.0 on October 16, 2013. Interestingly enough, I will be in the USA while this release happens and I will be watching it live during my daytime. Even if I had to stay up the entire night just to watch this release, I would do it. Here are the details of the announcement:

    Introducing NuoDB Blackbirds: THE Distributed Database
    Date: October 16, 2013
    Time: 1:00 PM EDT
    Location: Online
    Registration Link

    What is the best DBMS architecture to handle today's and tomorrow's evolving needs? The days of shared disk are over. The times are "a-changin" and IT infrastructure has to change with them. Join NuoDB live for the introduction of our latest major product release, NuoDB Blackbirds, and take a look at why the NuoDB distributed database architecture is the only answer for customers like Fathom Voice, a leading provider of Voice over IP (VoIP). NuoDB CEO Barry Morris welcomes Cameron Weeks, CEO of Fathom Voice, to discuss how his company is using its DBMS to break away from the pack and become the hottest player in VoIP. The webcast will include demonstrations of a single, logical database running in multiple geographies and a live Q&A.

    If for any reason you cannot watch it live, do not worry at all; just register at the Registration Link above, and after the event you will get the link to watch it on demand. You can watch the launch event at any time if you have registered for the launch.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: NuoDB

    Read the article

  • Partner Webcast – Oracle Exadata X3 Database In-Memory Machine - Next-Generation Technologies Update - 20 Dec 2012

    - by Thanos
    Oracle’s next-generation database machine, Oracle Exadata X3, combines massive memory and low-cost disks to deliver even faster performance and greater storage capabilities at the lowest cost, making it the ideal database platform for the varied and unpredictable workloads of cloud computing. Oracle Exadata is available in multiple configurations, including a low-cost eighth-rack configuration, so you can start small and grow at your own pace. We have also introduced new migration services designed to streamline implementation, thereby saving you time and money. If your IT department is expected to deliver business value—or even drive business growth—then you’ll want to join us for a live Webcast discussing how the new Oracle Exadata X3 can help you transform data management.

    Agenda:
    • Oracle Exadata Evolution
    • Oracle Exadata X3 Database In-Memory Machine
    • Hardware Update
    • Software Update
    • Exadata Unique Next Generation Technologies
    • Getting on board Oracle Exadata
    • Q&A

    Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web. Registrations received less than 24 hours prior to start time may not receive confirmation to attend.

    Thursday, December 20th, 10am CET (9am GMT). Duration: 1 hour. Register Now!

    For any questions please contact us at [email protected]. Visit our ISV Migration Center blog or follow us @oracleimc to learn more about Oracle Technologies and upcoming partner webcasts and events. Existing content available: YouTube - SlideShare - Oracle Mix.

    Read the article

  • Materialized View does not import properly when importing on a second instance of a database

    - by marinus
    When I import a database with materialized view mv_mt into just one (Oracle) database, everything is OK:

        create materialized view mv_mt
        refresh complete
        next trunc( sysdate ) + 1
        as
        select sysdate, media_type.* from media_type;

    But when I try to import the same database into a copy in another schema, I get the following errors:

        IMP-00017: following statement failed with ORACLE error 1:
         "BEGIN DBMS_JOB.ISUBMIT(JOB=438,WHAT='dbms_refresh.refresh(''"ALEXANDRA"."MV_MT"'');',NEXT_DATE=TO_DATE('2012-07-02:14:22:36','YYYY-MM-DD:HH24:MI:SS'),INTERVAL='sysdate + 1 / 24 / 60 / 6 ',NO_PARSE=TRUE); END;"
        IMP-00003: ORACLE error 1 encountered
        ORA-00001: unique constraint (SYS.I_JOB_JOB) violated
        ORA-06512: at "SYS.DBMS_JOB", line 100
        ORA-06512: at line 1

        IMP-00017: following statement failed with ORACLE error 23421:
         "BEGIN dbms_refresh.make('"ALEXANDRA"."MV_MT"',list=null,next_date=null,interval=null,implicit_destroy=TRUE,lax=FALSE,job=438,rollback_seg=NULL,push_deferred_rpc=TRUE,refresh_after_errors=FALSE,purge_option = 1,parallelism = 0,heap_size = 0); END;"
        IMP-00003: ORACLE error 23421 encountered
        ORA-23421: job number 438 is not a job in the job queue
        ORA-06512: at "SYS.DBMS_SYS_ERROR", line 86
        ORA-06512: at "SYS.DBMS_IJOB", line 793
        ORA-06512: at "SYS.DBMS_REFRESH", line 86
        ORA-06512: at "SYS.DBMS_REFRESH", line 62
        ORA-06512: at line 1

        IMP-00017: following statement failed with ORACLE error 23410:
         "BEGIN dbms_refresh.add(name='"ALEXANDRA"."MV_MT"',list='"ALEXANDRA"."MV_MT"',siteid=0,export_db='ORCL01'); END;"
        IMP-00003: ORACLE error 23410 encountered
        ORA-23410: materialized view "ALEXANDRA"."MV_MT" is already in a refresh group
        ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
        ORA-06512: at "SYS.DBMS_IREFRESH", line 484
        ORA-06512: at "SYS.DBMS_REFRESH", line 140
        ORA-06512: at "SYS.DBMS_REFRESH", line 125
        ORA-06512: at line 1

    Has anyone any ideas? Regards, Marinus
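    One possible workaround (an assumption on my part, not a verified fix): the errors all come from the import replaying the source database's refresh job (job number 438) through DBMS_JOB/DBMS_REFRESH, so you could let the import bring over the data and then drop and recreate the materialized view in the target schema, letting Oracle assign a fresh job number locally. A sketch:

        -- Run in the target schema after the import. The expression column is
        -- given an alias here, which CREATE ... AS SELECT generally requires.
        DROP MATERIALIZED VIEW mv_mt;

        CREATE MATERIALIZED VIEW mv_mt
          REFRESH COMPLETE
          NEXT TRUNC(SYSDATE) + 1
        AS
          SELECT SYSDATE AS snapshot_date, media_type.* FROM media_type;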

    Read the article

  • Database Backup History From MSDB in a pivot table

    - by steveh99999
    I knocked up a nice little query to display backup history for each database in a pivot table format. I wanted to display the most recent full, differential, and transaction log backup for each database. Here's the SQL:

        WITH backupCTE AS
        (
            SELECT name,
                   recovery_model_desc,
                   d AS 'Last Full Backup',
                   i AS 'Last Differential Backup',
                   l AS 'Last Tlog Backup'
            FROM
            (
                SELECT db.name, db.recovery_model_desc, type, backup_finish_date
                FROM master.sys.databases db
                LEFT OUTER JOIN msdb.dbo.backupset a ON a.database_name = db.name
                WHERE db.state_desc = 'ONLINE'
            ) AS Sourcetable
            PIVOT
            (
                MAX(backup_finish_date) FOR type IN ([D], [I], [L])
            ) AS MostRecentBackup
        )
        SELECT * FROM backupCTE

    This gives output such as the following (screenshot not reproduced here).

    With this query, I can then build up some straightforward checks to ensure backups are scheduled and running as expected. For example, the following logic can be used:

        -- database has never been backed up
        WHERE [Last Full Backup] IS NULL

        -- transaction log not backed up in the last 60 minutes
        WHERE [Last Tlog Backup] < DATEADD(mi, -60, GETDATE()) AND recovery_model_desc <> 'SIMPLE'

        -- no backup in the last day
        WHERE [Last Full Backup] < DATEADD(dd, -1, GETDATE()) AND [Last Differential Backup] < [Last Full Backup]

        -- no differential backup in the last day when the last full backup is over 8 days old
        WHERE [Last Differential Backup] < DATEADD(dd, -1, GETDATE()) AND [Last Full Backup] < DATEADD(dd, -8, GETDATE())

    Read the article

  • Customers Deploying Sun Oracle Database Machine

    - by kimberly.billings
    Philippine Savings Bank (PS Bank) recently deployed the Sun Oracle Database Machine to underpin its enterprise-wide analytics platform. Queries and requests that used to take from three hours to several days now complete in less than one minute, with near-real-time updates. Read the press release.

    EFU General Insurance also announced this week that it has deployed the Sun Oracle Database Machine. With Oracle, EFU will be able to open more sales channels via the Web and facilitate integration with other companies. As a result, more quality services can be offered to its customers via the Web because of the more agile and reliable IT infrastructure. In addition, a centralized IT environment will offer EFU management a real-time view of key information, enabling EFU to analyze business trends and make timely decisions. Read the press release.

    Let us know about your Sun Oracle Database Machine deployment!

    Read the article

  • Database Driven Web Application, C# Front-End and F# Back-End meaning

    - by user1473053
    Hi, I am an intern working with ASP.NET. My current task is to make a website which will incorporate some jQuery viewing features. It seems to me this project will primarily deal with reading data from a database and making graphs out of it. This will require me to build custom queries from whatever the client is looking at; I think it is going to be what this guy calls an Ad Hoc Query tool. My plan is to make it a database-driven website so I can utilize jQuery's dynamic viewing capabilities. I stumbled upon the functional programming paradigm and found F#. I read that because of its functional programming paradigm, it is a good language for asynchronous functions. I also read about how you can use it with LINQ to SQL and how easy it is to write queries without actually embedding the query language. I understand the concept of the MVC design pattern, but I don't understand what people mean about C# being the front-end and F# being the back-end. Can someone clarify this for me? Also, what are your thoughts about doing the project this way? Any comments and thoughts are greatly appreciated; I feel as if learning F# will be a great learning experience for me. My guess is that the F# back-end is the part that controls the calls to the database, so F# is possibly the model part of the design pattern, C# is the controller, and HTML, JavaScript and jQuery will be my view. Is that right?

    Read the article

  • What Pattern will solve this - fetching dependent record from database

    - by tunmise fasipe
    I have these classes:

        class Match
        {
            int MatchID;
            int TeamID;   // used to reference Team
            // ... other fields
        }

    Note: a Match actually has 2 teams, which means 2 TeamIDs.

        class Team
        {
            int TeamID;
            string TeamName;
        }

    In my view I need to display a List<Match> showing the TeamName, so I added another field:

        class Match
        {
            int MatchID;
            int TeamID;   // used to reference Team
            // ... other fields
            string TeamName;
        }

    I can now do:

        Match m = getMatch(id);
        m.TeamName = getTeamName(m.TeamId);  // get name from database

    But for a List<Match>, getTeamName(TeamId) will go to the database to fetch the TeamName for each TeamID. For a page of 10 matches, that could be (10 x 2 teams) = 20 trips to the database. To avoid this, I had the idea of loading everything once, storing it in memory, and only looking up the TeamName in memory. That made me rethink: what if there are 5000 records or more? What pattern is used to solve this, and how?
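    One common answer is to push the lookup into the database and fetch each page in a single round trip with a join, instead of one TeamName query per match. A sketch in SQL, assuming the Match table carries two team columns named HomeTeamID and AwayTeamID (hypothetical names for the "2 TeamIDs" mentioned above):

        -- One query for a whole page of matches, joining Team twice (once per side).
        SELECT TOP (10)
               m.MatchID,
               home.TeamName AS HomeTeamName,
               away.TeamName AS AwayTeamName
        FROM [Match] m
        JOIN Team home ON home.TeamID = m.HomeTeamID
        JOIN Team away ON away.TeamID = m.AwayTeamID
        ORDER BY m.MatchID;

    If the Team table stays small, the in-memory idea also works: load it once into a dictionary keyed by TeamID (essentially the Identity Map / lookup-cache pattern) and resolve names from that, which avoids the per-row trips without joining at all.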

    Read the article

  • Retrieving database column using JSON [migrated]

    - by arokia
    I have a database consisting of 4 columns (id, symbol, name, contractnumber). All 4 columns and their data are displayed in the user interface using JSON. There is a function which is responsible for adding a new column to the database, e.g. countrycode. The column is added successfully to the database BUT I am not able to show the newly added column in the user interface. Below is the code that displays the columns. Can you help me?

    table.php

        $(document).ready(function () {
            // prepare the data
            var theme = getDemoTheme();
            var source = {
                datatype: "json",
                datafields: [
                    { name: 'id' },
                    { name: 'symbol' },
                    { name: 'name' },
                    { name: 'contractnumber' }
                ],
                url: 'data.php',
                filter: function() {
                    // update the grid and send a request to the server.
                    $("#jqxgrid").jqxGrid('updatebounddata', 'filter');
                },
                cache: false
            };
            var dataAdapter = new $.jqx.dataAdapter(source);
            // initialize jqxGrid
            $("#jqxgrid").jqxGrid(
            {
                source: dataAdapter,
                width: 670,
                theme: theme,
                showfilterrow: true,
                filterable: true,
                columns: [
                    { text: 'id', datafield: 'id', width: 200 },
                    { text: 'symbol', datafield: 'symbol', width: 200 },
                    { text: 'name', datafield: 'name', width: 100 },
                    { text: 'contractnumber', filtertype: 'list', datafield: 'contractnumber' }
                ]
            });
        });

    data.php

        <?php
        // Include the db.php file
        include('db.php');
        $query = "SELECT * FROM pricelist";
        $result = mysql_query($query) or die("SQL Error 1: " . mysql_error());
        $pricelist = array();
        // get data and store in a json array
        while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
            $pricelist[] = array(
                'id' => $row['id'],
                'symbol' => $row['symbol'],
                'name' => $row['name'],
                'contractnumber' => $row['contractnumber']
            );
        }
        echo json_encode($pricelist);
        ?>

    Read the article

  • What is a Relational Database Management System (RDBMS)?

    A Relational Database Management System (RDBMS) can also be called a traditional database; it uses Structured Query Language (SQL) to provide access to stored data while ensuring the integrity of the data. The data is stored in a collection of tables defined by relationships between data items. In addition, data is permitted to be joined in new relationships. Traditional databases primarily process data through transactions, called transaction processing. Transaction processing is the methodology of grouping related business operations based on predefined business events. An example of this can be seen when a person attempts to purchase an item from an online e-tailer. The business must execute specific operations for the related business event. In this case, the business must store the following information: Customer Info, Order Info, Order Item Info, Customer Payment Data, Payment Results, and Current Order Status.

    Example: Pseudo SQL operations needed for processing an online e-tailer sale.

    Insert Customer into Customers
    Insert New Order into Orders
    Insert Each New Order Item into OrderItems
    Insert Customer Payment Info into PaymentInfo
    Insert Payment Processing Result into PaymentDetails
    Update Customer for Current Order Status

    Common Relational Database Management Systems

    Microsoft SQL Server
    Microsoft Access
    Oracle
    MySQL
    DB2

    It is important to note that no current RDBMS has fully implemented all of the relational principles.

    Common RDBMS Traits

    Volatile Data
    Supports Transaction Processing
    Optimized for Updates and Simple Queries
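    A sketch of that grouping in SQL, wrapping the pseudo operations above in a single transaction so the sale either commits completely or not at all (T-SQL flavored; table and column names are illustrative):

        SET XACT_ABORT ON;  -- any run-time error rolls back the whole order

        BEGIN TRANSACTION;

        INSERT INTO Customers (Name, Email)
        VALUES ('Jane Doe', 'jane@example.com');
        DECLARE @CustomerId INT = SCOPE_IDENTITY();

        INSERT INTO Orders (CustomerId, OrderDate, Status)
        VALUES (@CustomerId, GETDATE(), 'NEW');
        DECLARE @OrderId INT = SCOPE_IDENTITY();

        INSERT INTO OrderItems (OrderId, ProductId, Quantity)
        VALUES (@OrderId, 42, 1);

        INSERT INTO PaymentInfo (OrderId, Method)
        VALUES (@OrderId, 'CARD');

        INSERT INTO PaymentDetails (OrderId, Result)
        VALUES (@OrderId, 'APPROVED');

        UPDATE Orders SET Status = 'PAID' WHERE OrderId = @OrderId;

        COMMIT TRANSACTION;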

    Read the article

  • Project Showcase: SaaS Web Apps Hits a Home Run with New SCMS Database

    - by Webgui
    We love seeing projects from start to finish, and we're happy to share the latest example with you.

    Who: SaaS Web Apps - they use Software as a Service to create web applications that look and feel like desktop applications.

    What: SaaS Web Apps needed to build a Sports Contract Management System (SCMS) for one of its customers, Premier Stinson Sports.

    Why: The SCMS database is used for collecting, analyzing and recording college coach and athletic directors' employment and contract data.

    The Challenge: Premier Stinson Sports works with a number of partners, each with its own needs and unique requirements. For example, USA Today uses the system to provide cutting-edge news analysis, while The National Sports Law Institute of Marquette University Law School uses it for the latest sports contract data and student analysis. In addition, the system needed to be secure due to the sensitivity of the data; it was essential that the user security and permissions be easily configurable. As always, performance was a key factor, especially with the intense reporting and analytical capabilities of this project. Because of this, most of the processing had to be done on a dedicated server, but the project called for the richness and responsiveness of a desktop application.

    The Solution: To execute the project, SaaS Web Apps used ASP.NET-based Visual WebGui from Gizmox, combined with SQL Server 2008 and SQL Reporting Services. This combination resulted in a quick deployment for SaaS Web Apps' customers.

    The Result: The completed project gave each partner the scalability and availability of a web application with the performance and security of a desktop application. As an example, USA Today pulls data from this database to give readers the latest sports stats - Salary analysis of 2010 Football Bowl Subdivision Coaches. And here's a screenshot of the database itself. Great work, SaaS Web Apps!

    Read the article

  • EmblaCom Oy Maximizes Database Availability and Reduces Costs with MySQL Cluster

    - by Bertrand Matthelié
    Headquartered in Finland, EmblaCom Oy provides turnkey and cloud-hosted voice solutions to mobile operators around the globe. Since launching the original mobile private branch exchange (PBX) in 1998, the company has focused on helping its partners provide efficient voice communications to their key business customers. The company's voice solutions are used by millions of subscribers worldwide. EmblaCom Oy needed to replace several database engines with a standardized, scalable, development-friendly database solution to maximize availability and cut costs. The company chose MySQL Cluster Carrier Grade Edition, which has maximized accessibility to EmblaCom's services for its clients and their hundreds of thousands of subscribers. The initiative has also halved the cost of the database solution installation for customers, as well as lowered maintenance and customer service costs. Read the entire case study here.

    Read the article

  • ssh login successful, but scp password gives me "Permission denied"

    - by YANewb
    I'm trying to get some blogging software up on an organizational remote server. I tried to set up an SSH key but was having problems, and decided that getting the blog up and running was more important than dealing with the SSH key issue, so I ran ssh-keygen -R remoteserver.com. Now I can successfully log in with ssh -v remoteuser@remoteserver.com and the correct password. Once logged in, I can move around and read any file and directory that I should be able to read. But when I try to edit an existing -rw-r--r-- file with VIM, it shows up as read-only; if I try to edit permissions I get chmod: file.ext: Operation not permitted; and if I try to scp a new file from my local machine, I'm prompted for the remote user's password and then get scp: /home/path/to/file.ext: Permission denied. Since I didn't have any of these problems before I tried to set up the SSH key, I suspect these anomalies are a side effect of that, but I don't know how to troubleshoot this. So what does a foolish server-newb, such as myself, need to do to get edit capability back as a remote user?

    Addendum 1: My user IDs are different between my local machine and the remote server.

    For ssh I run ssh -v remoteuser@remoteserver.com; if I whoami I get remoteuser.
    For scp I run scp file.ext remoteuser@remoteserver.com:/path/to/file.ext from the local directory containing file.ext, while logged in as the local user; if I whoami I get localuser.

    The ls -l for two different files I've tried to scp:

        -rw-r--r--@ 1 localuser localgroup 20 Feb 11 21:03 phpinfo.php
        -rw-r--r-- 1 root localgroup 4 Feb 11 22:32 test.txt

    The ls -l for the file I've tried to VIM:

        -rw-r--r-- 1 remoteuser remotegroup 76 Jul 27 2009 info.txt

    Addendum 2: In the past I've set up SSH keys for git repositories. I don't want to completely destroy them, so in an attempt to follow a deer's train of thinking I renamed my ~/.ssh/ to ~/.ssh-bak/, then tested the different types of access. The abridged version of the terminal commands and results is below; I think everything is working until the 8th line from the end.

        localcomputer:~ localuser$ ssh -v remoteuser@remoteserver.com
        OpenSSH_5.2p1, OpenSSL 0.9.8l 5 Nov 2009
        debug1: Reading configuration data /etc/ssh_config
        debug1: Connecting to remoteserver.com [###.###.###.###] port 22.
        debug1: Connection established.
        debug1: identity file /Users/localuser/.ssh/identity type -1
        debug1: identity file /Users/localuser/.ssh/id_rsa type -1
        debug1: identity file /Users/localuser/.ssh/id_dsa type -1
        debug1: Remote protocol version 2.0, remote software version OpenSSH_5.8p2 FreeBSD-20110503
        debug1: match: OpenSSH_5.8p2 FreeBSD-20110503 pat OpenSSH*
        debug1: Enabling compatibility mode for protocol 2.0
        debug1: Local version string SSH-2.0-OpenSSH_5.2
        debug1: SSH2_MSG_KEXINIT sent
        debug1: SSH2_MSG_KEXINIT received
        debug1: kex: server->client aes128-ctr hmac-md5 none
        debug1: kex: client->server aes128-ctr hmac-md5 none
        debug1: SSH2_MSG_KEX_DH_GEX_REQUEST(1024<1024<8192) sent
        debug1: expecting SSH2_MSG_KEX_DH_GEX_GROUP
        debug1: SSH2_MSG_KEX_DH_GEX_INIT sent
        debug1: expecting SSH2_MSG_KEX_DH_GEX_REPLY
        The authenticity of host 'remoteserver.com (###.###.###.###)' can't be established.
        RSA key fingerprint is ##:##:##:##:##:##:##:##:##:##:##:##:##:##:##:##.
        Are you sure you want to continue connecting (yes/no)? yes
        Warning: Permanently added 'remoteserver.com,###.###.###.###' (RSA) to the list of known hosts.
        debug1: ssh_rsa_verify: signature correct
        debug1: SSH2_MSG_NEWKEYS sent
        debug1: expecting SSH2_MSG_NEWKEYS
        debug1: SSH2_MSG_NEWKEYS received
        debug1: SSH2_MSG_SERVICE_REQUEST sent
        debug1: SSH2_MSG_SERVICE_ACCEPT received
        debug1: Authentications that can continue: publickey,password
        debug1: Next authentication method: publickey
        debug1: Trying private key: /Users/localuser/.ssh/identity
        debug1: Trying private key: /Users/localuser/.ssh/id_rsa
        debug1: Trying private key: /Users/localuser/.ssh/id_dsa
        debug1: Next authentication method: password
        remoteuser@remoteserver.com's password:
        debug1: Authentication succeeded (password).
        debug1: channel 0: new [client-session]
        debug1: Requesting no-more-sessions@openssh.com
        debug1: Entering interactive session.
        Last login: Sun Feb 12 18:00:54 2012 from 68.69.164.123
        FreeBSD 6.4-RELEASE-p8 (VKERN) #1 r101746: Mon Aug 30 10:34:40 MDT 2010
        [remoteuser@remoteserver /home]$ ls -l
        total ###
        -rw-r--r-- 1 remoteuser remotegroup 76 Aug 12 2009 info.txt
        [remoteuser@remoteserver /home]$ vim info.txt
        ~ {at the bottom of the VIM screen it tells me it's [read only]}
        [remoteuser@remoteserver /home]$ whoami
        remoteuser
        [remoteuser@remoteserver /home]$ logout
        debug1: client_input_channel_req: channel 0 rtype exit-status reply 0
        debug1: client_input_channel_req: channel 0 rtype eow@openssh.com reply 0
        debug1: channel 0: free: client-session, nchannels 1
        Connection to remoteserver.com closed.
        Transferred: sent 3872, received 12496 bytes, in 107.4 seconds
        Bytes per second: sent 36.1, received 116.4
        debug1: Exit status 0

        localcomputer:localdirectory name$ scp -v phpinfo.php remoteuser@remoteserver.com:/home/www/remotedirectory/phpinfo.php
        Executing: program /usr/bin/ssh host remoteserver.com, user remoteuser, command scp -v -t /home/www/remotedirectory/phpinfo.php
        OpenSSH_5.2p1, OpenSSL 0.9.8l 5 Nov 2009
        debug1: Reading configuration data /etc/ssh_config
        debug1: Connecting to remoteserver.com [###.###.###.###] port 22.
        debug1: Connection established.
        debug1: identity file /Users/localuser/.ssh/identity type -1
        debug1: identity file /Users/localuser/.ssh/id_rsa type -1
        debug1: identity file /Users/localuser/.ssh/id_dsa type -1
        debug1: Remote protocol version 2.0, remote software version OpenSSH_5.8p2 FreeBSD-20110503
        debug1: match: OpenSSH_5.8p2 FreeBSD-20110503 pat OpenSSH*
        debug1: Enabling compatibility mode for protocol 2.0
        debug1: Local version string SSH-2.0-OpenSSH_5.2
        debug1: SSH2_MSG_KEXINIT sent
        debug1: SSH2_MSG_KEXINIT received
        debug1: kex: server->client aes128-ctr hmac-md5 none
        debug1: kex: client->server aes128-ctr hmac-md5 none
        debug1: SSH2_MSG_KEX_DH_GEX_REQUEST(1024<1024<8192) sent
        debug1: expecting SSH2_MSG_KEX_DH_GEX_GROUP
        debug1: SSH2_MSG_KEX_DH_GEX_INIT sent
        debug1: expecting SSH2_MSG_KEX_DH_GEX_REPLY
        debug1: Host 'remoteserver.com' is known and matches the RSA host key.
        debug1: Found key in /Users/localuser/.ssh/known_hosts:1
        debug1: ssh_rsa_verify: signature correct
        debug1: SSH2_MSG_NEWKEYS sent
        debug1: expecting SSH2_MSG_NEWKEYS
        debug1: SSH2_MSG_NEWKEYS received
        debug1: SSH2_MSG_SERVICE_REQUEST sent
        debug1: SSH2_MSG_SERVICE_ACCEPT received
        debug1: Authentications that can continue: publickey,password
        debug1: Next authentication method: publickey
        debug1: Trying private key: /Users/localuser/.ssh/identity
        debug1: Trying private key: /Users/localuser/.ssh/id_rsa
        debug1: Trying private key: /Users/localuser/.ssh/id_dsa
        debug1: Next authentication method: password
        remoteuser@remoteserver.com's password:
        debug1: Authentication succeeded (password).
        debug1: channel 0: new [client-session]
        debug1: Requesting no-more-sessions@openssh.com
        debug1: Entering interactive session.
        debug1: Sending command: scp -v -t /home/www/remotedirectory/phpinfo.php
        Sending file modes: C0644 20 phpinfo.php
        Sink: C0644 20 phpinfo.php
        scp: /home/www/remotedirectory/phpinfo.php: Permission denied
        debug1: client_input_channel_req: channel 0 rtype exit-status reply 0
        debug1: channel 0: free: client-session, nchannels 1
        debug1: fd 0 clearing O_NONBLOCK
        debug1: fd 1 clearing O_NONBLOCK
        Transferred: sent 1456, received 2160 bytes, in 0.6 seconds
        Bytes per second: sent 2322.3, received 3445.1
        debug1: Exit status 1

    Read the article

  • How to export SQL Server data from corrupted database (with disk write error)

    - by damitamit
    IT realised there was a disk write error on our production SQL Server 2005, which was causing the backups to fail. By the time they had realised this, the nightly backup was old, so we were not able to just restore the backup on another server. The database is still running and being used constantly. However, DBCC CHECKDB fails. The SQL Server backup task also fails, Copy Database fails, and the Export Data wizard fails. However, it seems all the data can be read from the tables (i.e. using bcp, etc.). Another observation I have made is that the transaction log is nearly double the size of the database. (Does that mean all the changes aren't being written to the MDF?)

    What would be the best plan of attack to get the database to a state where backups are working and the data is safe?

    1. Take the database offline and use the MDF/LDF to somehow create the database on another SQL Server?
    2. Export the data from the database using bcp. Create the database on another SQL Server (use the Generate Scripts function on the corrupt db to create the schema on the new db) and use bcp again to import the data.
    3. Some other option that is the right course of action in this situation?

    The IT manager says the data is safe because, if the server fails, the data can be restored from the mdf/ldf. I'm not sure, so I insisted that we start exporting the data each night as a failsafe (using bcp, for example). IT are also having issues on the hardware side of things, as supposedly the disk error is on a virtualized disk that can't be rebuilt like a normal RAID array (or something like that). Please excuse my use of incorrect terminology and incorrect assumptions about how SQL Server operates. I'm the application developer and have been called in to help (as it seems IT know less about SQL Server than I do). Many thanks, Amit
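    If the bcp route (option 2) is taken, here is a sketch that generates the export command for every user table so nothing gets missed (run it in the damaged database; the server name and output path are placeholders):

        -- Generate one "bcp ... out" command per user table; run the resulting
        -- lines from a command prompt to export all data in native format (-n)
        -- using a trusted connection (-T).
        SELECT 'bcp ' + QUOTENAME(DB_NAME()) + '.'
             + QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME(name)
             + ' out C:\export\' + name + '.dat -n -T -S ProdServer'
        FROM sys.tables
        ORDER BY name;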

    Read the article

  • Moving a Drupal between linux servers, best practice to avoid file-ownership problems

    - by zero
    I want to port a Drupal Commons 6x24 site from a local LAMP stack to a production webserver. Both systems run openSUSE Linux. How do I do this, and what are the most important steps? How should I handle file ownership? It's important for me to have full control of the file ownership. If I use the wwwrun account, I frequently run into problems due to a very strict webserver admin. For the long history of looking for fixes and solutions, see this thread, and even more interesting, see this very long and impressive thread here. All the troubles I run into have to do with file ownership and permissions.

    This is my current setup. Note: this was just a quick hacked installation - quick and dirty. My interest is in the general options I have when porting a Drupal site from Linux to Linux.

        linux-vi17:/srv/www/htdocs/com624 # ls -l
        insgesamt 224
        -rwxrwxrwx 1 root www 45285 19. Jan 00:54 CHANGELOG.txt
        -rwxrwxrwx 1 root www 925 19. Jan 00:54 COPYRIGHT.txt
        -rwxrwxrwx 1 root www 206 19. Jan 00:54 cron.php
        drwxrwxrwx 2 root www 4096 19. Jan 00:54 includes
        -rwxrwxrwx 1 root www 923 19. Jan 00:54 index.php
        -rwxrwxrwx 1 root www 1244 19. Jan 00:54 INSTALL.mysql.txt
        -rwxrwxrwx 1 root www 1011 19. Jan 00:54 INSTALL.pgsql.txt
        -rwxrwxrwx 1 root www 47073 19. Jan 00:54 install.php
        -rwxrwxrwx 1 root www 15572 19. Jan 00:54 INSTALL.txt
        -rwxrwxrwx 1 root www 14940 19. Jan 00:54 LICENSE.txt
        -rwxrwxrwx 1 root www 1858 19. Jan 00:54 MAINTAINERS.txt
        drwxrwxrwx 3 root www 4096 19. Jan 00:54 misc
        drwxrwxrwx 35 root www 4096 19. Jan 00:54 modules
        drwxrwxrwx 4 root www 4096 19. Jan 00:54 profiles
        -rwxrwxrwx 1 root www 1470 19. Jan 00:54 robots.txt
        drwxrwxrwx 2 root www 4096 19. Jan 00:54 scripts
        drwxrwxrwx 4 root www 4096 19. Jan 00:54 sites
        drwxrwxrwx 7 root www 4096 19. Jan 00:54 themes
        -rwxrwxrwx 1 root www 26250 19. Jan 00:54 update.php
        -rwxrwxrwx 1 root www 4864 19. Jan 00:54 UPGRADE.txt
        -rwxrwxrwx 1 root www 294 19. Jan 00:54 xmlrpc.php
        linux-vi17:/srv/www/htdocs/com624 #

    Thanks to BetaRides' answer, here is a quick overview of the drush rsync functionality (http://drush.ws/):

    core-rsync: Rsync the Drupal tree to/from another server using ssh.

    Examples:
        drush rsync @dev @stage
            Rsync Drupal root from dev to stage (one of which must be local).
        drush rsync ./ @stage:%files/img
            Rsync all files in the current directory to the 'img' directory in the file storage folder on stage.

    Arguments:
        source - May be rsync path or site alias. See rsync documentation and example.aliases.drushrc.php.
        destination - May be rsync path or site alias. See rsync documentation and example.aliases.drushrc.php.

    Options:
        --mode - The unary flags to pass to rsync; --mode=rultz implies rsync -rultz. Default is -az.
        --RSYNC-FLAG - Most rsync flags passed to drush sync will be passed on to rsync. See rsync documentation.
        --exclude-conf - Excludes settings.php from being rsynced. Default.
        --include-conf - Allow settings.php to be rsynced.
        --exclude-files - Exclude the files directory.
        --exclude-sites - Exclude all directories in "sites/" except for "sites/all".
        --exclude-other-sites - Exclude all directories in "sites/" except for "sites/all" and the site directory for the site being synced. Note: if the site directory is different between the source and destination, use --exclude-sites followed by "drush rsync @from:%site @to:%site".
        --exclude-paths - List of paths to exclude, separated by : (Unix-based systems) or ; (Windows).
        --include-paths - List of paths to include, separated by : (Unix-based systems) or ; (Windows).

    Topics: docs-aliases - Site aliases overview with examples.
    Aliases: rsync

    Read the article

  • Require extended permissions in FBML pyfacebook app

    - by jlpp
    I'm trying to get my FBML canvas page to automatically prompt new app users for permission to publish_stream. Following Facebook's documentation, I tried using the required_permissions argument to require_login. That is, I tried to use the pyfacebook require_login decorator like this:

        @facebook.require_login(required_permissions='publish_stream')

    as in:

        @decorator_from_middleware(FacebookMiddleware)
        @facebook.require_login(required_permissions='publish_stream')
        def canvas(request, template):
            ...

    Requesting extended permissions in a pyfacebook-based Facebook iframe app has been discussed, as has requesting extended permissions in an FBML app. My objective is to require extended permissions in an FBML app. Am I missing something, or can anyone suggest a workaround? Thanks.

    Read the article

  • SharePoint Permissions

    - by Greg
    I have a custom workflow. This workflow removes permissions on items when an item is added (for example, an item is added by a service account and, once added, those permissions need to be removed from that item). This works, as I have the service account hard-coded in the custom workflow. Now I would like to remove this hard-coding: when an item is added to a list, I would like to iterate through all users that have access to the list item. If a user matches some algorithm, then remove that user from the item permissions, which will be 0 to many. The piece I'm struggling with is how to iterate over all users with permission to an SPListItem. Any thoughts on how to accomplish this? Thanks in advance!

    Read the article
