Search Results

Search found 52602 results on 2105 pages for 'amazon web services'.

  • SSAS Multithreaded sync with Windows 2008 R2

    - by ACALVETT
    We have been happily running some of our systems on Windows 2003 and have had an upgrade to W2K8 R2 on the list for quite some time. The upgrade has now completed and we can start taking advantage of some of the new features, which is the reason for this post. For a long time we have used the sample Robocopy script from the SQLCat team to synchronize some of our larger SSAS databases. If you're wondering what I mean by large: around 5 TB with a good few thousand partitions. The script works like a dream...(read more)
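
    The headline feature here is Robocopy's multithreaded copy (the /MT switch), which only arrived with Windows 7 / Windows Server 2008 R2. A minimal sketch of the idea - the paths are hypothetical, and the SQLCat sample script wraps considerably more logic (detach/attach handling, partial sync) around the same call:

        robocopy "D:\OLAP\Data\SalesCube.0.db" "\\target\OLAP$\Data\SalesCube.0.db" ^
            /MIR /MT:16 /R:2 /W:5 /LOG:C:\Logs\ssas_sync.log

    /MIR mirrors the source folder (including deletions) and /MT:16 copies with 16 threads instead of the old single-threaded default.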

  • T-SQL Tuesday #005: Creating SSMS Custom Reports

    - by Mike C
    This is my contribution to the T-SQL Tuesday blog party, started by Adam Machanic and hosted this month by Aaron Nelson. Aaron announced this month's topic is "reporting" so I figured I'd throw a blog up on a reporting topic I've been interested in for a while -- namely creating custom reports in SSMS. Creating SSMS custom reports isn't difficult, but like most technical work it's very detailed with a lot of little steps involved. So this post is a little longer than usual and includes a lot of...(read more)

  • Hosting StreamInsight applications using WCF

    - by gsusx
    One of the fundamental differentiators of Microsoft's StreamInsight compared to other Complex Event Processing (CEP) technologies is its flexible deployment model. In that sense, a StreamInsight solution can be hosted within an application or as a server component. This duality contrasts with most of the popular CEP frameworks in the current market, which are almost exclusively server based. While it's undoubtedly true that the ability to embed a CEP engine in your applications opens new possibilities...(read more)
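
    For readers who haven't seen the embedded model, a minimal sketch of in-process hosting (the instance name "Default" and the application name are assumptions; the WCF-hosted variant the post describes builds on the same Server object):

        using Microsoft.ComplexEventProcessing;

        class EmbeddedHost
        {
            static void Main()
            {
                // The CEP engine runs inside this process rather than in a remote service.
                using (Server server = Server.Create("Default"))
                {
                    Application app = server.CreateApplication("SampleApp");
                    // Define event sources, queries and sinks against 'app' here.
                }
            }
        }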

  • Enabling OUD Entry Cache for large static groups

    - by Sylvain Duloutre
    Oracle Unified Directory can take advantage of several caches to improve performance, especially the so-called database cache and the file system cache. In addition to those, it is possible to use an entry cache to cache LDAP entries. By default, the entry cache is not used. In specific deployments involving large static groups, it may be worth loading the group entries into the entry cache to speed up group membership and group-based ACI evaluation. To do so, run the following commands.

    First, specify which entries should reside in the entry cache. In the command below, only entries matching the LDAP filter (|(objectclass=groupOfNames)(objectclass=groupOfUniqueNames)) will be stored in the entry cache:

        dsconfig set-entry-cache-prop \
                 --cache-name FIFO \
                 --add include-filter:\(\|\(objectclass=groupOfNames\)\(objectclass=groupOfUniqueNames\)\) \
                 --port <ADMIN_PORT> \
                 --bindDN cn=Directory\ Manager \
                 --bindPassword ****** \
                 --no-prompt

    Then enable the entry cache:

        dsconfig set-entry-cache-prop \
                 --cache-name FIFO \
                 --set enabled:true \
                 --port <ADMIN_PORT> \
                 --bindDN cn=Directory\ Manager \
                 --bindPassword ****** \
                 --no-prompt

    In addition to that, you can control how much memory the entry cache can use:

        oud@s96sec1d0-v3:/application/oud : dsconfig -X -n -p <ADMIN_PORT> -D "cn=Directory Manager" -w <password> get-entry-cache-prop --cache-name FIFO

        Property           : Value(s)
        -------------------:-----------------------------------------------------------
        cache-level        : 1
        enabled            : true
        exclude-filter     : -
        include-filter     : (|(objectclass=groupOfNames)(objectclass=groupOfUniqueNames))
        max-entries        : 2147483647
        max-memory-percent : 90

    You can change the max-entries and max-memory-percent properties to control the entry cache size using the dsconfig set-entry-cache-prop command.

  • Simple VIN API for decoding Vehicle Identification Numbers

    - by kerry
    Ever see a nifty tool that solves a problem for a particular domain that you may never encounter, but wish you had a reason to use it? That happened to me recently with this VIN API by the people at the PullMonkey blog. It will easily decode a VIN, returning the make, model, year, and other useful information about the vehicle. It was developed for Mr Quotey, a new free online insurance service. Check out the post if you would like to try it out; it even provides a simple example written in Ruby.

  • IDC Analyst Mike Fauscette Writes About Oracle And The Cloud

    - by Roxana Babiciu
    "It's becoming clear that cloud is now a core part of Oracle's strategy," says analyst Michael Fauscette in his post-OpenWorld article in Seeking Alpha. He believes we have a well-rounded portfolio "with a cloud platform/infrastructure, a broad selection of apps, and a partner marketplace." From his numerous conversations with customers, he highlights their continual interest in hybrid deployments and also in shifting apps to the cloud. Read more.

  • Re-running SSRS subscription jobs that have failed

    - by Rob Farley
    Sometimes, an SSRS subscription fails for some reason. It can be annoying, particularly as the appropriate response can be hard to see immediately. There may be a long list of jobs that failed one morning if a mail server is down, and trying to work out a way of running each one again can be painful. It's almost an argument for using shared schedules a lot, but the problem with this is that there are bound to be other things on that shared schedule that you wouldn't want to be re-run. Luckily, there's a table in the ReportServer database called dbo.Subscriptions, which is where the LastStatus of the subscription is stored. Having found the subscriptions that you're interested in, finding the SQL Agent jobs that correspond to them can be frustrating. Luckily, the job step command contains the subscriptionid, so it's possible to look them up based on that. And of course, once the jobs have been found, they can be executed easily enough. In this example, I produce a list of the commands to run the jobs. I can copy the results out and execute them.

        select 'exec sp_start_job @job_name = ''' + cast(j.name as varchar(40)) + ''''
        from msdb.dbo.sysjobs j
        join msdb.dbo.sysjobsteps js on js.job_id = j.job_id
        join [ReportServer].[dbo].[Subscriptions] s
          on js.command like '%' + cast(s.subscriptionid as varchar(40)) + '%'
        where s.LastStatus like 'Failure sending mail%';

    Another option could be to return the job step commands directly (js.command in this query), but my preference is to run the job that contains the step.
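
    For completeness, a sketch of that alternative - identical joins, but selecting the step command itself so each row can be executed directly (same assumptions as the query above):

        select js.command
        from msdb.dbo.sysjobs j
        join msdb.dbo.sysjobsteps js on js.job_id = j.job_id
        join [ReportServer].[dbo].[Subscriptions] s
          on js.command like '%' + cast(s.subscriptionid as varchar(40)) + '%'
        where s.LastStatus like 'Failure sending mail%';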

  • Can’t use Sub Reports and Shared Datasets in BIDS - Data Retrieval failed for the subreport '###', located at ### Please check the log files

    - by simonsabin
    SQL Server 2008 R2 brings us a great new feature: shared datasets. These allow datasets used by parameters and multiple reports to be defined in one place and used in multiple reports. I've been developing a report for SQLBits that required the use of sub reports. They all use a shared dataset for the agenda items. Initially I was developing in Report Builder but have now gone back to BIDS. In doing this, however, all the reports broke, reporting the following error: Data Retrieval failed for the subreport...(read more)

  • Optimize SUMMARIZE with ADDCOLUMNS in Dax #ssas #tabular #dax #powerpivot

    - by Marco Russo (SQLBI)
    If you started using DAX as a query language, you might have encountered some performance issues by using SUMMARIZE. The problem is related to the calculation you put in the SUMMARIZE, by adding what are called extension columns, which compute their value within a filter context defined by the rows considered in the group that the SUMMARIZE uses to produce each row in the output. Most of the time, for simple table expressions used in the first parameter of SUMMARIZE, you can optimize performance by removing the extended columns from the SUMMARIZE and adding them by using an ADDCOLUMNS function. In practice, instead of writing:

        SUMMARIZE( <table>, <group_by_column>, <column_name>, <expression> )

    you can write:

        ADDCOLUMNS(
            SUMMARIZE( <table>, <group_by_column> ),
            <column_name>, CALCULATE( <expression> )
        )

    The performance difference might be huge (orders of magnitude), but this optimization might produce different semantics, and in those cases it should not be used. A longer discussion of this topic is included in my Best Practices Using SUMMARIZE and ADDCOLUMNS article on SQLBI, which also includes several details about the DAX syntax with extended columns. For example, did you know that you can create an extended column in SUMMARIZE and ADDCOLUMNS with the same name as existing measures? It is *not* a good thing to do, and by reading the article you will discover why. Enjoy DAX!
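
    To make the rewrite concrete, here is the same pattern with hypothetical table and column names (a Sales table grouped by calendar year):

        -- Extension column computed inside SUMMARIZE (slower)
        EVALUATE
        SUMMARIZE (
            Sales,
            'Date'[CalendarYear],
            "Sales Amount", SUM ( Sales[Amount] )
        )

        -- Grouping first, then adding the column via ADDCOLUMNS (usually much faster)
        EVALUATE
        ADDCOLUMNS (
            SUMMARIZE ( Sales, 'Date'[CalendarYear] ),
            "Sales Amount", CALCULATE ( SUM ( Sales[Amount] ) )
        )

    The CALCULATE wrapper is what turns the row context produced by ADDCOLUMNS into a filter context, mimicking what SUMMARIZE did implicitly.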

  • SSIS Expression Tester Tool

    - by Davide Mauri
    Thanks to my friend Doug's blog I've found a very nice tool made by fellow MVP Darren Green which really helps to make SSIS developers' lives easier: http://expressioneditor.codeplex.com/Wikipage?ProjectName=expressioneditor In brief, the tool allows you to test SSIS expressions, so that you can evaluate them before using them in SSIS packages. Cool and useful!

  • Vizioz is back on the web

    - by Vizioz Limited
    I founded Vizioz in 1996 as a website development business. Since then Vizioz has come in and out of focus; it is now time to start building Vizioz into a website development consultancy. Over the following two months we will be launching our new website, where we will be promoting our ASP.NET, C#, XHTML, CSS, XSLT, SQL, Umbraco Development, Wordpress Development and Graphic Design services. And of course our website will be powered by the open source content management system called Umbraco CMS, which we have used for 10+ client sites in the last twelve months. I will be using this new blog to talk about client projects we are currently working on and any Umbraco development related bits of code that we feel would be good to release back to the Umbraco community.

  • Methodology behind fetching large XML data sets in pieces

    - by Jerry Dodge
    I am working on an HTTP server in Delphi which simply sends back a custom XML dataset. I am not following any type of standard formatting, such as SOAP. I have the system working seamlessly, except for one small flaw: when I have a very large dataset to send back to the client, it might take up to 2 minutes for all the data to be transferred. The HTTP server I'm building is essentially an XML-data-based API around a database, implementing the common business rules - therefore, the requests are specific to the data behind the system.

    When, for example, I fetch a large set of product data, I would like to break this down and send it back piece by piece. However, a single HTTP request calls for a single response. I can't necessarily keep feeding the client with multiple different XML packets unless the client explicitly requests them. I don't have any session management, but rather an API key. I know if I had sessions, I could keep a dataset alive temporarily for a client, and they could request bits and pieces of it. However, without session management, I would have to execute the SQL query multiple times (for each chunk of data), and in the meantime, if that data changes, the "pages" might get mixed up, causing items to show on the wrong pages after navigating to a different page.

    So how is this commonly handled? What's the methodology behind breaking down a large XML dataset into chunks to lighten the load?
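
    One common answer (my sketch, not from the original thread) is keyset pagination: each request carries the last key value from the previous chunk, and the WHERE clause anchors on that stable key rather than a row offset, so rows inserted or deleted between requests don't shift items across pages. Table and column names below are hypothetical:

        -- Client sends @LastProductID = 0 on the first request,
        -- then the highest ProductID it received on each subsequent one.
        select top (500) ProductID, Name, Price
        from dbo.Products
        where ProductID > @LastProductID
        order by ProductID;

    The server stays stateless - no session needed - because the client's @LastProductID is the only cursor state.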

  • Implement Budget Allocation in DAX for Power Pivot and Tabular #powerpivot #tabular #ssas #dax

    - by Marco Russo (SQLBI)
    Comparing sales and budget, or costs and budget, is a very common operation. However, it is often the case that you have different granularities in the tables containing the budget and the data to compare it with. There are two ways to handle that: you can limit the comparison to the granularity that is common to the two tables, or you can allocate the budget where it's not defined. For example, if you have a budget defined by quarter and category, you might want to allocate it by month and product. In this way, you can do the comparison as if you had a more granular definition of the budget, without actually having to do the manual job of allocating the data (usually in an Excel worksheet!). If you want to do budget allocation in DAX, you can use the Budget Patterns we published on DAX Patterns. If you come from an MDX/OLAP background, at first you might find it hard to solve the problem of not having the attribute hierarchies that help you propagate the budget values to lower hierarchical levels. However, I think that once you get used to DAX, you will find the behavior very predictable and easy to "debug", even for more complex allocation formulas. You just have to be careful in writing the DAX formula, but the pattern we wrote should help you design the right data model, without creating physical relationships to the budget table! This pattern is also based on the Handling Different Granularities scenario I discussed a couple of weeks ago.
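
    To give a flavor of the idea (a simplified sketch, not the actual DAX Patterns formula; all table and column names are hypothetical): a quarter-level budget can be spread over months in proportion to the days each month contributes to its quarter, with no physical relationship to the Budget table:

        Allocated Budget :=
        SUMX (
            VALUES ( 'Date'[YearQuarter] ),
            CALCULATE (
                SUM ( Budget[Amount] ),
                FILTER ( ALL ( Budget ), Budget[YearQuarter] = 'Date'[YearQuarter] )
            )
                * DIVIDE (
                    CALCULATE ( COUNTROWS ( 'Date' ) ),                        -- days visible (e.g. one month)
                    CALCULATE ( COUNTROWS ( 'Date' ),
                                ALLEXCEPT ( 'Date', 'Date'[YearQuarter] ) )    -- days in the whole quarter
                )
        )

    The real pattern generalizes the weighting and handles the category/product dimension as well.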

  • Avoiding SOA Anti-Patterns with AIA

    - by Hans Viehmann
    A white paper from the Enterprise Architecture team just landed on my desk that deals with SOA anti-patterns. It is not an AIA paper in the strict sense, but with AIA you naturally have good support in avoiding the mistakes described there. The white paper covers topics such as: avoiding SOA silos; SOA maturity and project management; a sprawling service portfolio; dealing with reference architectures; and EAI 2.0 - point-to-point integration on open standards. A link to the document is attached below - enjoy the read ... Oracle White Paper: SOA Anti-Patterns.

  • SSAS Native v .net Provider

    - by ACALVETT
    Recently I was investigating why a new server, which is in its parallel running phase, was taking significantly longer to process the daily data than the server it's due to replace. The server has SQL & SSAS installed, so the problem was not likely to be in the network transfer as it's using shared memory. As I dug around the SQL DMVs I noticed in sys.dm_exec_connections that the SSAS connection had a packet size of 8000 bytes instead of the usual 4096 bytes, and from there I found that the datasource...(read more)
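
    The check the author describes can be reproduced with a quick DMV query (a sketch; add a WHERE clause to narrow it to the SSAS processing session in your environment):

        select session_id, net_transport, net_packet_size
        from sys.dm_exec_connections
        order by net_packet_size desc;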

  • Windows Server AppFabric Beta 2 Refresh for Visual Studio 2010/.NET 4 RTM

    - by The Official Microsoft IIS Site
    Today we are pleased to announce a Beta 2 Refresh for Windows Server AppFabric. This build supports the recently released .NET Framework 4 and Visual Studio 2010 RTM versions—a request we’ve had from a number of you. Organizations wanting to use Windows Server AppFabric with the final RTM versions of .NET 4 and Visual Studio 2010 are encouraged to download the Beta 2 Refresh today. Please click here for an installation guide on installing the Beta 2 Refresh. We encourage developers and IT professionals...(read more)

  • New licensing for SQL Server 2012 and #BISM #Tabular usage

    - by Marco Russo (SQLBI)
    Last week Microsoft announced a new licensing scheme for SQL Server 2012. If you are interested in an extensive discussion of the new licensing scheme, Denny Cherry wrote a great blog post about it. I'd like to comment on the new BI Edition license. Teo Lachev has already commented on the numbers and I agree with him. I generally like the new licensing model of SQL 2012. It maintains a very low entry barrier for SSRS/SSAS/SSIS (Standard Edition). It has a reasonable licensing scheme for 20-50...(read more)

  • codesniffer command not being recognized after several installs and upgrades

    - by numerical25
    I've tried to install CodeSniffer using PEAR but my Mac is not recognizing the phpcs command. Here is my PEAR config:

        Configuration (channel pear.php.net):
        =====================================
        Auto-discover new Channels                    auto_discover     1
        Default Channel                               default_channel   pear.php.net
        HTTP Proxy Server Address                     http_proxy        <not set>
        PEAR server [DEPRECATED]                      master_server     pear.php.net
        Default Channel Mirror                        preferred_mirror  pear.php.net
        Remote Configuration File                     remote_config     <not set>
        PEAR executables directory                    bin_dir           /usr/local/pear/bin
        PEAR documentation directory                  doc_dir           /usr/local/pear/docs
        PHP extension directory                       ext_dir           /opt/local/lib/php/extensions/no-debug-non-zts-20090626
        PEAR directory                                php_dir           /usr/local/pear/share/pear
        PEAR Installer cache directory                cache_dir         /private/tmp/pear/cache
        PEAR configuration file directory             cfg_dir           /usr/local/pear/cfg
        PEAR data directory                           data_dir          /usr/local/pear/data
        PEAR Installer download directory             download_dir      /tmp/pear/install
        PHP CLI/CGI binary                            php_bin           /opt/local/bin/php
        php.ini location                              php_ini           /opt/local/etc/php5/php.ini-development
        --program-prefix passed to PHP's ./configure  php_prefix        <not set>
        --program-suffix passed to PHP's ./configure  php_suffix        <not set>
        PEAR Installer temp directory                 temp_dir          /tmp/pear/install
        PEAR test directory                           test_dir          /usr/local/pear/tests
        PEAR www files directory                      www_dir           /usr/local/pear/www
        Cache TimeToLive                              cache_ttl         3600
        Preferred Package State                       preferred_state   stable
        Unix file mask                                umask             22
        Debug Log Level                               verbose           1
        PEAR password (for maintainers)               password          <not set>
        Signature Handling Program                    sig_bin           /usr/local/bin/gpg
        Signature Key Directory                       sig_keydir        /opt/local/etc/pearkeys
        Signature Key Id                              sig_keyid         <not set>
        Package Signature Type                        sig_type          gpg
        PEAR username (for maintainers)               username          <not set>
        User Configuration File                       Filename          /Users/anthonygordon/.pearrc
        System Configuration File                     Filename          /opt/local/etc/pear.conf

    I checked php_bin and the PHP executable is there. When I run phpcs I get "command not found". I've tried to upgrade PEAR and to uninstall and reinstall CodeSniffer - everything. When I list the installed packages I get:

        Package          Version State
        Archive_Tar      1.3.10  stable
        Console_Getopt   1.3.1   stable
        PEAR             1.9.4   stable
        PHP_CodeSniffer  1.4.0   stable
        Structures_Graph 1.0.4   stable
        XML_Util         1.2.1   stable
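
    For what it's worth, the config above points at the likely cause: phpcs is installed into PEAR's bin_dir (/usr/local/pear/bin), which is probably not on the shell's PATH. A suggested fix - my assumption based on that value, not part of the original question:

        # bin_dir taken from `pear config-show`; adjust if yours differs
        export PATH="$PATH:/usr/local/pear/bin"
        # persist it for future shells
        echo 'export PATH="$PATH:/usr/local/pear/bin"' >> ~/.bash_profile
        which phpcs   # should now resolve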

  • DTLoggedExec 1.0 Stable Released!

    - by Davide Mauri
    After several years of development I've finally released the first non-beta version of DTLoggedExec! I'm now very confident that the product is stable and solid and has all the features that are important to have (at least for me).

    DTLoggedExec 1.0: http://dtloggedexec.codeplex.com/releases/view/44689

    Here are the release notes:

    - FIRST NON-BETA RELEASE! :)
    - Code cleaned up
    - Added SetPackageInfo method to ILogProvider interface to make future improvements easier
    - Deprecated the arguments 'ProfileDataFlow', 'ProfilePath', 'ProfileFileName'
    - Added the new argument 'ProfileDataFlowFileName' that replaces the old 'ProfileDataFlow', 'ProfilePath' and 'ProfileFileName' arguments
    - Updated database scripts to support new reports
    - Split releases into three different packages for easier maintenance and updates: DTLoggedExec Executable, Samples & Reports
    - Fixed Issue #25738 (http://dtloggedexec.codeplex.com/WorkItem/View.aspx?WorkItemId=25738)
    - Fixed Issue #26479 (http://dtloggedexec.codeplex.com/WorkItem/View.aspx?WorkItemId=26479)

    To make things easier to maintain I've divided the original package into three different releases. One is the DTLoggedExec executable; samples and reports are now available in separate packages so that I can update them more frequently without having to touch the engine. Source code for everything is available through source code control: http://dtloggedexec.codeplex.com/SourceControl/list/changesets

    As usual, comments and feedback are more than welcome! (Just use CodePlex, please, so it will be easier for me to keep track of requests and issues.)

  • Unable to configure a service to run at startup with update-rc.d

    - by ujjain
    I would like to have transmission-daemon and vnstat automatically run at startup. I was able to configure this for apache2 and proftpd with exactly the same commands.

        795  sudo update-rc.d transmission-daemon remove
        797  sudo update-rc.d -f transmission-daemon remove
        798  sudo update-rc.d transmission-daemon defaults
        799  sudo update-rc.d vnstat remove
        800  sudo update-rc.d -f vnstat remove
        801  sudo update-rc.d -f vnstat defaults
        802  sudo update-rc.d -f vnstat enable
        805  reboot
        807  history

        root@htpc:/home/administrator# sudo update-rc.d -f transmission-daemon remove
         Removing any system startup links for /etc/init.d/transmission-daemon ...
           /etc/rc0.d/K20transmission-daemon
           /etc/rc1.d/K20transmission-daemon
           /etc/rc2.d/S20transmission-daemon
           /etc/rc3.d/S20transmission-daemon
           /etc/rc4.d/S20transmission-daemon
           /etc/rc5.d/S20transmission-daemon
           /etc/rc6.d/K20transmission-daemon
        root@htpc:/home/administrator# sudo update-rc.d -f transmission-daemon defaults
         Adding system startup for /etc/init.d/transmission-daemon ...
           /etc/rc0.d/K20transmission-daemon -> ../init.d/transmission-daemon
           /etc/rc1.d/K20transmission-daemon -> ../init.d/transmission-daemon
           /etc/rc6.d/K20transmission-daemon -> ../init.d/transmission-daemon
           /etc/rc2.d/S20transmission-daemon -> ../init.d/transmission-daemon
           /etc/rc3.d/S20transmission-daemon -> ../init.d/transmission-daemon
           /etc/rc4.d/S20transmission-daemon -> ../init.d/transmission-daemon
           /etc/rc5.d/S20transmission-daemon -> ../init.d/transmission-daemon
        root@htpc:/home/administrator# service vnstat status
         * vnStat daemon is not running
        root@htpc:/home/administrator# service transmission-daemon status
         * transmission-daemon is not running
        root@htpc:/home/administrator# service transmission-daemon start
         * Starting bittorrent daemon transmission-daemon    [ OK ]
        root@htpc:/home/administrator# service vnstat start
         * Starting vnStat daemon vnstatd    [ OK ]
        root@htpc:/home/administrator# service apache2 status
        Apache2 is running (pid 1137).
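
    Since the rc*.d symlinks are clearly being created yet the daemons aren't running after a reboot, one thing worth checking (my suggestion, not part of the original question) is whether the init scripts bail out early at boot:

        # Debian/Ubuntu's transmission-daemon init script reads /etc/default/transmission-daemon;
        # if it sets ENABLE_DAEMON=0 the script exits without starting anything
        cat /etc/default/transmission-daemon
        # update-rc.d relies on the LSB header to order and enable the script
        grep -B1 -A8 'BEGIN INIT INFO' /etc/init.d/transmission-daemon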

  • Setting up SANED with sane-test

    - by Enoon
    I'm trying to set up a network service for running saned (with the sane-test backend) on Ubuntu 12.10, running in VirtualBox. I followed the directions found here: https://help.ubuntu.com/community/ScanningHowTo and I got to the point where, if I use a front-end like xsane or run the command scanimage -d test, I get the desired behaviour (the test image). But when I try to use the network daemon to access the backend (from the localhost) I get "Failed to connect". I used telnet on 127.0.0.1 6566 and I got the following error:

        saned: symbol lookup error: saned: undefined symbol: sanei_w_init

    I'm a Linux newbie, so I could be making some stupid mistake. Any ideas on how to fix this? Thanks.

    [Update] After posting this question I logged out, logged back in and it worked. After a few days I tried again and it gave the same error as before. Any help?
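
    An undefined-symbol error like this usually means the saned binary is resolving a libsane build that doesn't match it (a stale copy in /usr/local/lib, or mismatched package versions). A couple of quick checks - my suggestion, not from the original thread:

        # see which libsane the binary actually loads; note the path it resolves to
        ldd "$(which saned)" | grep -i sane
        # compare the installed sane package versions for mismatches
        dpkg -l | grep -i sane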

  • Here we go again - quest for web hosted forum via javascript

    - by jim
    Hello all.

    Disclaimer: if this is the wrong location for this question, then please advise me accordingly.

    Background: I've been using Disqus and Intense Debate as a 'comments' service for a variety of my sites to great effect, and love the fact that I get a lot of the Facebook/Twitter integration 'for free', as well as the SEO benefits.

    Request: to this end, does anyone out there know of similar services that can be used to pull entire forums/threaded discussions into the app in a similar fashion (i.e. via Ajax web services)? Google has been at a loss to turn anything up on this front, and I'm therefore wondering if it's unlikely that such a 'service' exists.

    Respect: hope this strikes a chord out there... BTW - although I'm using this in ASP.NET MVC, I'm aware that this technology could be used on any platform capable of consuming JavaScript via Ajax, thus the wide spread of 'tags'.
