Search Results

Search found 80334 results on 3214 pages for 'file replication services'.


  • batch file infinite loop when parsing file

    - by Bart
    Okay, this should be a really simple task, but it's proving to be more complicated than I think it should be. I'm clearly doing something wrong and would like someone else's input. What I would like to do is parse through a file containing paths to directories and set permissions on those directories. Here is an example line of the input file; there are several lines, all formatted the same way, each with a different path to a directory:

        E:\stuff\Things\something else (X)\

    (The file in question is generated under Cygwin, using find to list all directories with "(X)" in the name. The file is then passed through unix2win to make it Windows compatible. I've also tried manually creating the input file from within Windows to rule out the file's creation method as the problem.) Here's where I'm stuck... I wrote the following quick-and-dirty batch file in Windows XP and it worked without any issues at all, but it will not work in Server 2k8. Batch file code to run through the file and set permissions:

        FOR /F "tokens=*" %%A IN (dirlist.txt) DO echo y| cacls "%%A" /T /C /G "Domain Admins":f "Some Group":f "some-security-group":f

    What this is SUPPOSED to do (and does in XP) is loop through the specified file (dirlist.txt) and run cacls.exe on each directory it pulls from the file. The "echo y|" is in there to automagically confirm when cacls helpfully asks "are you sure?" for every directory in the list. Unfortunately, however, what it DOES is fall into an infinite loop. I've tried surrounding everything after "DO" with quotes, which prevents the endless loop but confuses cacls so it throws an error. Interestingly, I've tried running the code from after "DO" manually (obviously replacing the variable with the full path, copied straight from the file) at a command prompt, and it runs as expected. I don't think it's the file or the loop, as adding quotes to the command to be executed prevents the loop from continuing past where it's supposed to... I really have no idea at this point. Any help would be appreciated. I have a feeling it's going to be something incredibly stupid... but I'm pulling my hair out, so I thought I'd ask.
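    A possible fix, sketched here as an assumption rather than something from the original post: Server 2008 ships icacls.exe as the replacement for the deprecated cacls, and icacls never asks "are you sure?", so the fragile "echo y|" pipe (a likely culprit for the loop) can be dropped entirely. The group names are copied from the question and may need adjusting:

        REM Hedged sketch: icacls instead of cacls, so no confirmation pipe is needed.
        REM /T = recurse, /C = continue on errors, F = full control.
        FOR /F "usebackq tokens=*" %%A IN ("dirlist.txt") DO (
            icacls "%%A" /T /C /grant "Domain Admins":F "Some Group":F "some-security-group":F
        )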

    Read the article

  • Microsoft Management Console stops working when I add a snap-in to it

    - by JayaprakashReddy
    I have Windows 7 Ultimate. I'm opening mmc.exe as administrator and trying to add Certificates or any other snap-in; while loading that snap-in, MMC breaks and displays the following message, then closes automatically once I click the Close button on that message. What could be the problem? I did the following to fix it, but none of these succeeded: I tried to repair the OS; I repaired files using this method; I even repaired the installation using this link. Update: @oldskool: Here is the debug process output (sorry, it's a long output text):

    'mmc.exe': Loaded 'C:\Windows\System32\mmc.exe', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ntdll.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\kernel32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\KernelBase.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\gdi32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\user32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\lpk.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\usp10.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\msvcrt.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mfc42u.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ole32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\rpcrt4.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\oleaut32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\odbc32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\advapi32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\sechost.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mmcbase.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\shlwapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\uxtheme.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\duser.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\imm32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\msctf.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\odbcint.dll', Binary was not built with debug information. 
'mmc.exe': Loaded 'C:\Windows\System32\dui70.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.windows.common-controls_6595b64144ccf1df_6.0.7600.16661_none_420fe3fa2b8113bd\comctl32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\shell32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\cryptbase.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\urlmon.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wininet.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\iertutil.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\crypt32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\msasn1.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\clbcatq.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mmcndmgr.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dwmapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\oleacc.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\cryptsp.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\rsaenh.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\RpcRtRemote.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mlang.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\xmllite.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\version.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\apphelp.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\msi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\mscormmc.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mscoree.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.vc80.crt_1fc8b3b9a1e18e3b_8.0.50727.4927_none_d08a205e442db5b5\msvcr80.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\azroleui.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\atl.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\secur32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\netutils.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dsrole.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\logoncli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dsuiext.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ntdsapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ws2_32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\nsi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\activeds.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\adsldpc.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\Wldap32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mpr.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\netapi32.dll', Cannot find or open the PDB file 'mmc.exe': 
Loaded 'C:\Windows\System32\srvcli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wkscli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\certmgr.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\certcli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\CertEnroll.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\cryptui.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ncrypt.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\bcrypt.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wintrust.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\imagehlp.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\sspicli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\aclui.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\IPHLPAPI.DLL', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\winnsi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\slc.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\comsnap.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mfc42.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mycomput.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\devmgr.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\setupapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\cfgmgr32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\devobj.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\devrtl.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\newdev.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dmdskmgr.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dmutil.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dmdskres.dll', Binary was not built with debug information. 'mmc.exe': Loaded 'C:\Windows\System32\dmdskres2.dll', Binary was not built with debug information. 
'mmc.exe': Loaded 'C:\Windows\System32\gpedit.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dssec.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\authz.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dfscli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\samcli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\gpapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\framedynos.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wtsapi32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ipsmsnap.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\winipsec.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\userenv.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\profapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ipsecsnp.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\polstore.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\localsec.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wdc.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\pdh.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\pdhui.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\comdlg32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\credui.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wevtapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\pla.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\tdh.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\winsta.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\utildll.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\browcli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\vdmdbg.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\pmcsnap.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\winspool.drv', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\puiapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wsecedit.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\scecli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\filemgmt.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Program Files\Microsoft SQL Server\100\Tools\Binn\SqlManager.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.vc80.mfc_1fc8b3b9a1e18e3b_8.0.50727.4053_none_cbf21254470d8752\mfc80u.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.vc80.crt_1fc8b3b9a1e18e3b_8.0.50727.4927_none_d08a205e442db5b5\msvcp80.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.vc80.atl_1fc8b3b9a1e18e3b_8.0.50727.4053_none_d1c738ec43578ea1\ATL80.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 
'C:\Windows\winsxs\x86_microsoft.windows.common-controls_6595b64144ccf1df_5.82.7600.16661_none_ebfb56996c72aefc\comctl32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.vc80.mfcloc_1fc8b3b9a1e18e3b_8.0.50727.4053_none_03ca5532205cb096\mfc80ENU.dll', Binary was not built with debug information. 'mmc.exe': Loaded 'C:\Program Files\Microsoft SQL Server\100\Tools\Binn\Resources\1033\SqlManager.rll', Binary was not built with debug information. 'mmc.exe': Loaded 'C:\Windows\System32\msxml6.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SqlManager.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wbem\wbemcntl.dll', Cannot find or open the PDB file The thread 'Win32 Thread' (0xf74) has exited with code 0 (0x0). Unhandled exception at 0x774d35e3 in mmc.exe: 0xC0000374: A heap has been corrupted.

    Read the article

  • Problem while installing MySQL Cluster

    - by Champion
    I downloaded the latest 64-bit MySQL Cluster file (mysql-cluster-gpl-7.1.9-linux-x86_64-glibc23) from the MySQL site and ran the following command to install the MySQL server:

        /home/mysql_cluster/mysqlc/scripts/mysql_install_db --no-defaults --basedir=/home/mysql_cluster/mysqlc --datadir=/home/mysql_cluster/my_cluster/mysqld_data

    It gives the following errors:

        /home/mysql_cluster/mysqlc/scripts/mysql_install_db: line 245: /home/mysql_cluster/mysqlc/bin/my_print_defaults: No such file or directory
        Neither host 'db' nor 'localhost' could be looked up with /home/mysql_cluster/mysqlc/bin/resolveip
        Please configure the 'hostname' command to return a correct hostname.
        If you want to solve this at a later stage, restart this script with the --force option

    My host is pinging and the hostname command gives the correct hostname. I also tried with the --force option, but it still fails with the following errors:

        /home/mysql_cluster/mysqlc/scripts/mysql_install_db: line 245: /home/mysql_cluster/mysqlc/bin/my_print_defaults: No such file or directory
        Installing MySQL system tables...
        /home/mysql_cluster/mysqlc/scripts/mysql_install_db: line 393: /home/mysql_cluster/mysqlc/bin/mysqld: No such file or directory
        Installation of system tables failed! Examine the logs in /home/mysql_cluster/my_cluster/mysqld_data for more information.

    It says the above files don't exist, but these files are present (checked with the ls command). We are using a Xen VM with 64-bit Debian Lenny. How can this issue be resolved?
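    A hedged diagnostic sketch, not from the original post: on Linux, "No such file or directory" for a binary that ls can clearly see usually means the ELF loader the binary was linked against is missing (for example, a 64-bit build on a 32-bit kernel, or a glibc mismatch), and mysql_install_db is also sensitive to the directory it is run from. The paths below are the ones from the question:

        # check that the VM really is 64-bit and the binary matches it
        uname -m                                     # expect x86_64
        file /home/mysql_cluster/mysqlc/bin/mysqld   # expect "ELF 64-bit"

        # run the installer from the basedir so its relative paths resolve
        cd /home/mysql_cluster/mysqlc
        scripts/mysql_install_db --no-defaults --basedir=$PWD \
            --datadir=/home/mysql_cluster/my_cluster/mysqld_data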

    Read the article

  • Implementing a generic repository for WCF Data Services

    - by cibrax
    The repository implementation I am going to discuss here is not exactly what someone would call a repository in terms of DDD, but it is an abstraction layer that becomes handy at the moment of unit testing the code around this repository. In other words, you can easily create a mock to replace the real repository implementation. The WCF Data Services update for .NET 3.5 introduced a nice feature to support two-way data binding, which is very helpful for developing WPF or Silverlight based applications but also for implementing the repository I am going to talk about. As part of this feature, the WCF Data Services client library introduced a new collection, DataServiceCollection<T>, that implements INotifyPropertyChanged to notify the data context (DataServiceContext) about any change in the association links. This means that it is no longer necessary to manually set or remove the links in the data context when an item is added to or removed from a collection. Before having this new collection, you basically used the following code to add a new item to a collection:

        Order order = new Order { Name = "Foo" };
        OrderItem item = new OrderItem { Name = "bar", UnitPrice = 10, Qty = 1 };
        var context = new OrderContext();
        context.AddToOrders(order);
        context.AddToOrderItems(item);
        context.SetLink(item, "Order", order);
        context.SaveChanges();

    Now, thanks to this new collection, everything is much simpler and similar to what you have in other ORMs like Entity Framework or L2S:

        Order order = new Order { Name = "Foo" };
        OrderItem item = new OrderItem { Name = "bar", UnitPrice = 10, Qty = 1 };
        order.Items.Add(item);
        var context = new OrderContext();
        context.AddToOrders(order);
        context.SaveChanges();

    In order to use this new feature, you first need to enable V2 in the data service, and then use some specific arguments in the datasvcutil tool (you can find more information about this new feature and how to use it in this post):

        DataSvcUtil /uri:"http://localhost:3655/MyDataService.svc/" /out:Reference.cs /dataservicecollection /version:2.0

    Once you use those two arguments, the generated proxy classes will use DataServiceCollection<T> rather than a simple ObjectCollection<T>, which was the default collection in V1. There are some aspects that you need to know to use this feature correctly:

    1. All the entities retrieved directly from the data context with a query track changes and report those to the data context automatically.

    2. An entity created with "new" does not track any change in the properties or associations. In order to enable change tracking in this entity, you need to do the following trick:

        public Order CreateOrder()
        {
            var collection = new DataServiceCollection<Order>(this.context);
            var order = new Order();
            collection.Add(order);
            return order;
        }

    You basically need to create a collection and add the entity to that collection with the "Add" method to enable change tracking on that entity.

    3. If you need to attach an existing entity (for example, if you created the entity with the "new" operator rather than retrieving it from the data context with a query) to a data context for tracking changes, you can use the "Load" method in the DataServiceCollection:

        var order = new Order { Id = 1 };
        var collection = new DataServiceCollection<Order>(this.context);
        collection.Load(order);

    In this case, the order with Id = 1 must exist in the data source exposed by the data service. Otherwise, you will get an error because the entity does not exist.
    The cool extension methods discussed by Stuart Leeks in this post, which replace all the magic strings in the "Expand" operation with expression trees, represent another feature I am going to use to implement this generic repository. Thanks to these extension methods, you can replace the following query with magic strings by a piece of code that only uses expressions. Magic strings:

        var customers = dataContext.Customers
            .Expand("Orders")
            .Expand("Orders/Items");

    Expressions:

        var customers = dataContext.Customers
            .Expand(c => c.Orders.SubExpand(o => o.Items));

    That query basically returns all the customers with their orders and order items. OK, now that we have the automatic change tracking support and the expression support for explicitly loading entity associations, we are ready to create the repository. The interface for this repository looks like this:

        public interface IRepository
        {
            T Create<T>() where T : new();
            void Update<T>(T entity);
            void Delete<T>(T entity);
            IQueryable<T> RetrieveAll<T>(params Expression<Func<T, object>>[] eagerProperties);
            IQueryable<T> Retrieve<T>(Expression<Func<T, bool>> predicate, params Expression<Func<T, object>>[] eagerProperties);
            void Attach<T>(T entity);
            void SaveChanges();
        }

    The Retrieve and RetrieveAll methods are used to execute queries against the data service context. While both methods receive an array of expressions to load associations explicitly, only the Retrieve method receives a predicate representing the "where" clause. The following code represents the final implementation of this repository:

        public class DataServiceRepository : IRepository
        {
            DataServiceContext context;

            public DataServiceRepository()
                : this(new DataServiceContext())
            {
            }

            public DataServiceRepository(DataServiceContext context)
            {
                this.context = context;
            }

            private static string ResolveEntitySet(Type type)
            {
                var entitySetAttribute = (EntitySetAttribute)type.GetCustomAttributes(typeof(EntitySetAttribute), true).FirstOrDefault();
                if (entitySetAttribute != null)
                    return entitySetAttribute.EntitySet;
                return null;
            }

            public T Create<T>() where T : new()
            {
                var collection = new DataServiceCollection<T>(this.context);
                var entity = new T();
                collection.Add(entity);
                return entity;
            }

            public void Update<T>(T entity)
            {
                this.context.UpdateObject(entity);
            }

            public void Delete<T>(T entity)
            {
                this.context.DeleteObject(entity);
            }

            public void Attach<T>(T entity)
            {
                var collection = new DataServiceCollection<T>(this.context);
                collection.Load(entity);
            }

            public IQueryable<T> Retrieve<T>(Expression<Func<T, bool>> predicate, params Expression<Func<T, object>>[] eagerProperties)
            {
                var entitySet = ResolveEntitySet(typeof(T));
                var query = context.CreateQuery<T>(entitySet);
                foreach (var e in eagerProperties)
                {
                    query = query.Expand(e);
                }
                return query.Where(predicate);
            }

            public IQueryable<T> RetrieveAll<T>(params Expression<Func<T, object>>[] eagerProperties)
            {
                var entitySet = ResolveEntitySet(typeof(T));
                var query = context.CreateQuery<T>(entitySet);
                foreach (var e in eagerProperties)
                {
                    query = query.Expand(e);
                }
                return query;
            }

            public void SaveChanges()
            {
                this.context.SaveChanges(SaveChangesOptions.Batch);
            }
        }

    For instance, you can use the following code to retrieve customers with first name equal to "John", and all their orders, in a single call:

        repository.Retrieve<Customer>(
            c => c.FirstName == "John", // Where
            c => c.Orders.SubExpand(o => o.Items));

    In case you want to have some pre-defined queries that you are going to use across several places, you can put them in a specific class.
        public static class CustomerQueries
        {
            public static Expression<Func<Customer, bool>> LastNameEqualsTo(string lastName)
            {
                return c => c.LastName == lastName;
            }
        }

    And then use it with the repository:

        repository.Retrieve<Customer>(
            CustomerQueries.LastNameEqualsTo("foo"),
            c => c.Orders.SubExpand(o => o.Items));

    Read the article

  • ADO.NET (WCF) Data Services Query Interceptor Hangs IIS

    - by PreMagination
    I have an ADO.NET Data Service that's supposed to provide read-only access to a somewhat complex database. Logically I have table-per-type (TPT) inheritance in my data model, but the EDM doesn't implement inheritance. (Limitation of EF and navigation properties on derived types. STILL not fixed in EF4!) I can query my EDM directly (using a separate project) using a copy of the query I'm trying to run against the web service; results are returned within 10 seconds. Disabling the query interceptors, I'm able to make the same query against the web service, and results are returned similarly quickly. I can enable some of the query interceptors and the results are returned slowly, up to a minute or so later. Alternatively, I can enable all the query interceptors, expand fewer of the properties on the main object I'm querying, and results are returned in a similar period of time. (I've increased some of the timeout periods.) Up to this point, SQL Profiler indicates the slow-down is the database. (That's a post for a different day.) But when I enable all my query interceptors and expand all the properties I'd like to have, the IIS worker process pegs the CPU for 20 minutes and a query is never even made against the database. This implies to me that yes, my implementation probably sucks, but regardless, the Data Services "tier" is having an issue it shouldn't. WCF tracing didn't reveal anything interesting to my untrained eye.

    Details: the data model is Agent-Person-Student. Student has a collection of referrals. Students and referrals are private; queries against the web service should only return "your" students and referrals. This means Person and Agent need to be filtered too. Other entities (Agent-Organization-School) can be accessed by anyone who has authenticated. The existing security model is poorly suited to perform this type of filtering for this type of data access; the query interceptors are complicated and cause EF to generate some entertaining SQL queries.

    Sample interceptor:

        [QueryInterceptor("Agents")]
        public Expression<Func<Agent, Boolean>> OnQueryAgents()
        {
            // Agent is a Person (1), Educator (2), Student (3), or Other Person (13); allow if scope permissions exist
            return ag => (ag.AgentType.AgentTypeId == 1 || ag.AgentType.AgentTypeId == 2 || ag.AgentType.AgentTypeId == 3 || ag.AgentType.AgentTypeId == 13)
                && ag.Person.OrganizationPersons.Count<OrganizationPerson>(op =>
                    op.Organization.ScopePermissions.Any<ScopePermission>(p =>
                        p.ApplicationRoleAccount.Account.UserName == HttpContext.Current.User.Identity.Name
                        && p.ApplicationRoleAccount.Application.ApplicationId == 124)
                    || op.Organization.HierarchyDescendents.Any<OrganizationsHierarchy>(oh =>
                        oh.AncestorOrganization.ScopePermissions.Any<ScopePermission>(p =>
                            p.ApplicationRoleAccount.Account.UserName == HttpContext.Current.User.Identity.Name
                            && p.ApplicationRoleAccount.Application.ApplicationId == 124))) > 0;
        }

    The query interceptors for Person, Student, and Referral are all very similar, i.e. they traverse multiple same/similar tables to look for ScopePermissions as above.
    Sample query:

        var referrals = (from r in service.Referrals
                             .Expand("Organization/ParentOrganization")
                             .Expand("Educator/Person/Agent")
                             .Expand("Student/Person/Agent")
                             .Expand("Student")
                             .Expand("Grade")
                             .Expand("ProblemBehavior")
                             .Expand("Location")
                             .Expand("Motivation")
                             .Expand("AdminDecision")
                             .Expand("OthersInvolved")
                         where r.DateCreated >= coupledays && r.DateDeleted == null
                         select r);

    Any suggestions or tips would be greatly appreciated, for fixing my current implementation or for developing a new one, with the caveats that the database can't be changed and that ultimately I need to expose a large portion of the database via a web service that limits data access to the data the caller is authorized for, for the purpose of data integration with multiple outside parties. THANK YOU!!!

    Read the article

  • MySQL replicate multiple places

    - by Frederik Nielsen
    Very tricky task to find a good title for this question, but here goes the question: I have a few development machines that I develop my PHP applications on, testing via a local webserver. This works out pretty well for each machine. However, I would like to replicate the DB from my machines to a central location. So, to sum up:

        DEV1 -> CENTRAL
        DEV2 -> CENTRAL
        DEV3 -> CENTRAL
        CENTRAL -> DEV1
        CENTRAL -> DEV2
        CENTRAL -> DEV3

    I hope this makes sense, as I cannot find an easy way to explain it. Basically, it is a two-way replication, where all 4 databases contain the same info, and each of them can be updated locally and then pushed out to the others. Is this actually doable? All my dev machines are running Windows 7, and my central DB server is running CentOS 6.
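    For what it's worth, classic MySQL replication only lets a server follow a single master, so a star with three writable spokes like the one above is not directly possible; the usual workarounds are a replication ring (DEV1 -> CENTRAL -> DEV2 -> CENTRAL is not a ring, but DEV1 -> DEV2 -> DEV3 -> CENTRAL -> DEV1 is) or an external sync tool. A minimal sketch of the my.cnf settings each node in such a ring would need, assuming four writable servers; the offsets keep AUTO_INCREMENT keys from colliding:

        [mysqld]
        server-id                = 1          # unique on every machine
        log-bin                  = mysql-bin
        log-slave-updates                     # forward changes around the ring
        auto_increment_increment = 4          # number of writable servers
        auto_increment_offset    = 1          # 1..4, unique per machine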

    Read the article

  • Windows Azure Mobile Services Updates Keep Coming

    - by Clint Edmonson
    Some exciting new Windows Azure Mobile Services features were delivered to production this week. The highlights include:

        - iPhone and iPad connectivity support via a new iOS SDK
        - Integrated authentication so developers can configure user authentication via Microsoft Account, Facebook, Twitter, and Google
        - New server-side Mobile Service script modules
        - Access to structured storage, Windows Azure Blob, Table, Queues, and Service Bus
        - Email services through a partnership with SendGrid
        - SMS & voice services through a partnership with Twilio
        - Mobile Services hosting expanded to the west coast US

    The iOS SDK

    I'm excited to share that we've announced the release of an under-development iOS client SDK for Windows Azure Mobile Services. The iOS SDK joins the Windows 8 SDK launched with Windows Azure Mobile Services as well as client SDKs released by Xamarin for MonoTouch and MonoDroid. The native iOS SDK is for developers programming in Objective-C on the iPhone and iPad platforms. The SDK gives developers the same level of access to data storage using dynamic schematization that is available for Windows 8. Also, iOS applications can use the same authentication options available in Mobile Services. While full iOS support is still in development, the libraries are currently available on GitHub. There's a great getting-started tutorial to walk you through building a simple iOS "Todo List" app that stores data in Windows Azure. These additional tutorials explore how to use the iOS client libraries to store data and authenticate users:

        - Get started with data in Mobile Services for iOS
        - Get started with authentication in Mobile Services for iOS

    What's New in Authentication

    Available to both iOS and Windows 8 developers, Mobile Services has expanded its authentication options. Developers can now use Microsoft, Facebook, Twitter, and Google authentication. Similar to using Microsoft accounts for authentication, developers must sign up through Facebook, Twitter, or Google's developer portal in order to authenticate through them. These tutorials walk through how to register your Mobile Service with an identity provider:

        - How to register your app with Microsoft Account
        - How to register your app with Facebook
        - How to register your app with Twitter
        - How to register your app with Google

    And these tutorials walk through authenticating against Mobile Services:

        - Get started with authentication in Mobile Services for Windows Store (C#)
        - Get started with authentication in Mobile Services for Windows Store (JavaScript)
        - Get started with authentication in Mobile Services for iOS

    What's New in Mobile Service Scripts

    Some great new functionality is now available in the Mobile Service script layer. These server-side scripts are triggered off of any CRUD operation on a Mobile Service's table and can already handle data and query validation, filtering, web requests, and more. Today, the Azure SDK module is now available to these scripts, giving them access to blob storage, Service Bus, and table storage. Check out the new tutorials on the Windows Azure Node.js developer center to learn more about working with Blob, Tables, Queues, and Service Bus using the azure module. In addition, SendGrid and Twilio are now available via modules that can be called from the scripts as well. This gives developers the ability to send emails (SendGrid) or SMS text messages (Twilio) whenever a script is fired. Windows Azure customers receive a special offer of 25,000 free emails per month from SendGrid and 1000 free text messages from Twilio.
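    As a rough illustration of what such a server-side script looks like, here is a hedged sketch of a table insert script using the SendGrid module; the require name and SendGrid constructor follow the pattern in the tutorials of that era, but the addresses and credentials below are placeholders:

        var SendGrid = require('sendgrid').SendGrid;

        function insert(item, user, request) {
            request.execute({
                success: function () {
                    request.respond(); // reply to the client first
                    // then send a notification email (placeholder credentials)
                    var sendgrid = new SendGrid('SENDGRID_USER', 'SENDGRID_PASSWORD');
                    sendgrid.send({
                        to: 'ops@example.com',
                        from: 'mobileservice@example.com',
                        subject: 'New item inserted',
                        text: 'Inserted item id: ' + item.id
                    }, function (success, message) {
                        if (!success) console.error(message);
                    });
                }
            });
        }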
    Expanded Data Center Availability

    In addition to Mobile Services being available in our US East data center, they can now be spun up in US West. The above features are all now live in production and are available to use immediately. If you don't already have a Windows Azure account, you can sign up for a free trial and start using Mobile Services today. The Windows Azure Mobile Developer Center has been updated with new tutorials that cover these new features in detail. And don't forget: Windows Azure Mobile Services are still free for your first ten applications running on shared compute instances. Stay tuned to my twitter feed for Windows Azure announcements, updates, and links: @clinted

    Read the article

  • File corruption after copying files in Windows 7 64 bit using two methods

    - by DustByte
    I have 5000 pictures and other files in a directory taking up 35 GB. I want to duplicate this directory.

    Method 1: I do a simple copy and paste of the directory in Explorer. I have the habit of checking the checksums after copying important files. In this case I noticed that around 2000 files failed the MD5 test. On closer inspection of a randomly chosen JPEG with different checksums, it turns out that some XMP metadata had changed. In particular, the tag <MicrosoftPhoto:DateAcquired> had changed the date from 2009 to today (possibly around the time I was copying the files). I have no idea what triggered this XMP data to be changed, exactly when it was changed, or why for these particular files, but at least it seems to explain the checksum discrepancy.

    Method 2: As I want the exact files to be duplicated, I tried the program FreeFileSync to mirror the directory, hoping no XMP metadata would mysteriously change. A checksum test in addition to a thorough file comparison test in FreeFileSync led to two similar yet different results: 31 files fail the checksum test, 23 files fail the file comparison test. The smaller set is not entirely contained in the bigger set, although many files occur in both. What is alarming here is that not only JPEGs are flagged as altered but also some AVIs, MPGs, and a large 7-Zip file. Closer inspection of a JPEG indicates that it is indeed corrupt: the bottom half of the picture is simply plain gray. Due to the size of the 7-Zip file, I have not been able to pin down the discrepancy. Note that in both methods, every file has its correct file size after being copied.

    Question: Any thoughts on what is possibly going on here? I have never had this problem before, and I am now terrified that files get corrupted after simple actions like copy/paste and file sync. Even if I manage to successfully copy the files somehow, I would still like an explanation for this.

    Read the article

  • Exploding maps in Reporting Services 2008 R2

    - by Rob Farley
    Kaboom! Well, that was the imagery that secretly appeared in my mind when I saw "USA By State Exploded" in the list of installed maps in Report Builder 3.0 – part of the spatial offering of SQL Server Reporting Services 2008 R2. Alas, it just means that the borders are bigger, as clicking on it showed me. Unfortunately, I'm not interested in maps of the US. None of my clients are there (at least, not yet – feel free to get in touch if you want to change this 'feature' of my company). So instead, I've recently been getting hold of some data for Australian areas. I've just bought some postcode shapes for South Australia, and will use this in demos for conferences and for showing clients how this kind of report can really impact their reporting. One of the companies I was talking to about getting shape files sent me a sample. So I chose the "ESRI shapefile" option you see above, and browsed to my file. Australians will immediately recognise the result as the area around Wollongong, just south of Sydney. Well, apart from me. I didn't. I had to put a Bing Maps layer behind it to work that out, but that's not for this post. The thing that I discovered was that if I selected the Exploded USA option (but without clicking Next) and then chose my shape file, my area around Wollongong would be exploded too! Huh! I think this is actually a bug, but a potentially useful one! Some further investigation (involving creating two identical reports, one with this exploded view, one without) showed that the exploded view is done by reducing the ScaleFactor property of the PolygonLayer in the map control. The exploded version has it below 1. If you set it to above one, your shapes overlap. I discovered this by accident... I guess I hadn't looked through all the PolygonLayer options to work out what they all do. And because this post is about Reporting, it can qualify for this month's T-SQL Tuesday, hosted by Aaron Nelson (@sqlvariant).

    Read the article

  • FOSS ASP.Net Session Replication Solution?

    - by jsight
    I've been searching (with little success) for a free/open-source session clustering and replication solution for ASP.NET. I've run across the usual suspects (Indexus SharedCache, memcached); however, each has some limitations:

        - Indexus - Very immature, stubbed session interface implementation. It's otherwise a great caching solution, though.
        - Memcached - Little replication/failover support without going to a DB backend.
        - Several SF.net projects - All aborted in the early stages... nothing that appears to have any traction, and one which seems to have gone all commercial.
        - Microsoft Velocity - Not OSS, but seems nice. Unfortunately, I didn't see where CTP1 supported failover, and there is no clear roadmap for this one. I fear that this one could fall off into the ether like many other MS dev projects.

    I am fairly used to the Java world, where it is kind of taken for granted that many solutions to problems such as this will be available from the FOSS world. Are there any suitable alternatives available in the .NET world?

    Read the article

  • Sql Server Replication: Snapshot vs Merge

    - by Zyphrax
    Background information: Let's say I have two database servers, both SQL Server 2008. One is in my LAN (ServerLocal), the other one is in a remote hosting environment (ServerRemote). I have created a database on ServerLocal and have an exact copy of that database on ServerRemote. The database on ServerRemote is part of a web application, and I would like to keep its data up to date with the data in the database on ServerLocal. ServerLocal is able to communicate with ServerRemote; this is one-way traffic. Communication from ServerRemote to ServerLocal isn't available.

    Current solution: I thought it would be a nice solution to use replication, so I've made ServerLocal a publisher and subscriptions are pushed to ServerRemote. This works fine: when a snapshot is transferred to ServerRemote, the existing data is purged and the ServerRemote database is once again an exact replica of the database on ServerLocal.

    The problem: Records that exist on ServerRemote but don't exist on ServerLocal are removed. This doesn't matter for most of my tables, but in some of my tables I'd like to keep the existing data (aspnet_users, for instance) and update the records if necessary. What kind of replication fits my problem?
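    One hedged option, assuming transactional (rather than snapshot) replication is acceptable for this one-way setup: an article can be configured not to drop or purge the subscriber's copy of a table when the subscription is initialized, via the article's pre-creation command. A sketch; the publication name is a placeholder:

        -- keep subscriber-side rows in aspnet_users instead of dropping/truncating
        EXEC sp_addarticle
            @publication      = N'MyPublication',   -- placeholder
            @article          = N'aspnet_users',
            @source_object    = N'aspnet_users',
            @pre_creation_cmd = N'none';            -- none | delete | drop | truncate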

    Read the article

  • How to synchronize two (or n) replication processes for SQL Server databases?

    - by Yauheni Sivukha
    There are two master databases and two read-only copies updated by standard transactional replication. It is necessary to map some entity from both read-only databases; let's say that database A contains orders and database B contains order lines. The problem is that replication to one database can lag behind replication to the second database, and at the moment of mapping the read-only databases will have inconsistent data. For example: we stored 2 orders with lines at 19:00 and 19:03. The mapping process started at 19:05, but by the moment of mapping, database A's replication had processed all changes up to 19:03, while database B's replication had processed only changes up to 19:00. After mapping, we will have an order entity with the order as of 19:03 and lines as of 19:00. Trouble is guaranteed. :) In my particular case both databases have a temporal model, so it is possible to fetch data for every time slice, but the problem is to identify the time of the latest replication. Question: How do I synchronize replication processes for several databases to avoid the situation described above? Or, in other words, how do I compare the last replication time in each database? UPD: The only way I see to synchronize is to continuously write timestamps into service tables in each database and to check these timestamps on the replicated servers. Is that an acceptable solution?
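    A hedged sketch of that timestamp (heartbeat) approach, assuming SQL Server and a scheduled job on each publisher; the mapping process then only maps rows older than the smaller of the two watermarks. Database and table names are placeholders:

        -- on each master, replicated like any other table
        CREATE TABLE dbo.ReplicationHeartbeat (ts DATETIME NOT NULL);

        -- scheduled job on each master, e.g. every 10 seconds
        INSERT INTO dbo.ReplicationHeartbeat (ts) VALUES (GETDATE());

        -- on the read-only side, before mapping: the safe watermark is the
        -- minimum of the two latest replicated heartbeats
        SELECT MIN(t.latest) AS safe_watermark
        FROM (SELECT MAX(ts) AS latest FROM A_readonly.dbo.ReplicationHeartbeat
              UNION ALL
              SELECT MAX(ts) FROM B_readonly.dbo.ReplicationHeartbeat) AS t;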

    Read the article

  • Application losing Printer within Terminal Services for remote users

    - by Richard
    Question: What I need to do is have a permanent link to a printer, normally only accessible through Terminal Services (printer redirection), to allow Sage Line 50 layouts to see that printer persistently, even after users have disconnected and reconnected to the Terminal Services session. Although the printer is accessible each time a user connects to the Sage server via Terminal Services, it is given a different session number and therefore the Sage layout sees it as a different printer.

    History behind the question:

        - Users use Terminal Services to connect to a Sage server on a different site.
        - Sage Line 50 v15 is used on that server.
        - Users want to print invoices (Sage layouts) locally.
        - The Sage server cannot see the users' local printers; to get around this, users use the printer redirection feature of Terminal Services.
        - The individual reports can be edited to point to a specific printer by default. This means the user just has to select an invoice and click print, then select the layout/report wanted, and it auto-prints that invoice to the default printer specified.

    The problem occurs because the layouts are edited to point to the user's local printer "Ricoh 1018d (session#)"; note the "(session#)", as this is the user's local printer being redirected through the Terminal Services session. Users are able to print using the Sage layouts once the default printer is set up within the layout and saved, but as soon as the user disconnects from the Terminal Services session and then reconnects in the morning and goes to print, it has lost the connection to that printer. I understand why it failed: the printer exists on a per-session basis, and the layout is not able to hold on to the connection from a previous session. Thanks in advance for any assistance...

    Read the article

  • How to restrict file system access when logged into Terminal Services

    - by pghcpa
    What I need to accomplish: with one login, when the user is physically in the building, I need them to see everything; when they are using Terminal Services with the same login, they should not be able to see the file system on the network. I can lock down the PC running Terminal Services, as that is its only use.

    Details: Windows 2003 Server with Terminal Services. One login per user (e.g., johndoe). When johndoe logs into the network at his desk in the office, he can see the network files according to group policy. When johndoe logs into Terminal Services from outside the building, we do not want to allow him to see the network. We are using 2X to publish an app, but that app has a "feature" that allows the user to see the network. The published application on Terminal Services (only) is a document management system that is tied to the Windows login, so I can't give users two logins.

    Read the article

  • Benefits of PerformancePoint Services Using SharePoint Server 2010

    - by Wayne
    What is PerformancePoint Services?

    Most of the time, the metrics that make up your key performance indicators are not simple values from a data source. In SharePoint Server 2007 PerformancePoint Services, you could create two kinds of KPI metrics: simple single-value metrics from any supported data source, or complex multiple-value metrics from a single Analysis Services data source using MDX. Now things are even easier with PerformancePoint Services in SharePoint 2010. Let us check what it is. PerformancePoint Services in SharePoint Server 2010 is a performance management service that you can use to monitor and analyze your business. By providing flexible, easy-to-use tools for building dashboards, scorecards, reports, and key performance indicators (KPIs), PerformancePoint Services can help everyone across an organization make informed business decisions that align with companywide objectives and strategy. Scorecards, dashboards, and KPIs help drive accountability. Integrated analytics help employees move quickly from monitoring information to analyzing it and, when appropriate, sharing it throughout the organization. Prior to the addition of PerformancePoint Services to SharePoint Server, Microsoft Office PerformancePoint Server 2007 functioned as a standalone server. Now PerformancePoint functionality is available as an integrated part of the SharePoint Server Enterprise license, as is the case with Excel Services in Microsoft SharePoint Server 2010. The popular features of earlier versions of PerformancePoint Services are preserved along with numerous enhancements and additional functionality.

    New PerformancePoint Services features

    PerformancePoint Services can now utilize SharePoint Server scalability, collaboration, backup and recovery, and disaster recovery capabilities. Dashboards and dashboard items are stored and secured within SharePoint lists and libraries, providing you with a single security and repository framework.

    New features and enhancements of SharePoint 2010 PerformancePoint Services:

    • With PerformancePoint Services functioning as a service in SharePoint Server, dashboards and dashboard items are stored and secured within SharePoint lists and libraries, providing you with a single security and repository framework. The new architecture also takes advantage of SharePoint Server scalability, collaboration, backup and recovery, and disaster recovery capabilities. You can also include and link PerformancePoint Services Web Parts with other SharePoint Server Web Parts on the same page. The new architecture also streamlines security models that simplify access to report data.

    • The Decomposition Tree is a new visualization report type available in PerformancePoint Services. You can use it to quickly and visually break down higher-level data values from a multi-dimensional data set to understand the driving forces behind those values. The Decomposition Tree is available in scorecards and analytic reports and ultimately in dashboards.

    • You can access more detailed business information with improved scorecards. Scorecards have been enhanced to make it easy for you to drill down and quickly access more detailed information. PerformancePoint scorecards also offer more flexible layout options, dynamic hierarchies, and calculated KPI features. Using this enhanced functionality, you can now create custom metrics that use multiple data sources. You can also sort, filter, and view variances between actual and target values to help you identify concerns or risks.
    • Better Time Intelligence filtering capabilities that you can use to create and use dynamic time filters that are always up to date. Other improved filters improve the ability for dashboard users to quickly focus in on information that is most relevant.

    • The ability to include and link PerformancePoint Services Web Parts together with other PerformancePoint Services Web Parts on the same page.

    • Easier authoring and publishing of dashboard items by using Dashboard Designer.

    • SQL Server Analysis Services 2008 support.

    • Increased support for accessibility compliance in individual reports and scorecards.

    • The KPI Details report is a new report type that displays contextually relevant information about KPIs, metrics, rows, columns, and cells within a scorecard. The KPI Details report works as a Web Part that links to a scorecard or individual KPI to show relevant metadata to the end user in SharePoint Server. This Web Part can be added to PerformancePoint dashboards or any SharePoint Server page.

    • Create analytic reports to better understand underlying business forces behind the results. Analytic reports have been enhanced to support value filtering, new chart types, and server-based conditional formatting.

    To conclude, PerformancePoint Services, by becoming tightly integrated with SharePoint Server 2010, takes advantage of many enterprise-level SharePoint Server 2010 features. Unfortunately, SharePoint Foundation 2010 doesn't include this feature. There are still many choices in the SharePoint family of products, including SharePoint Server 2010, SharePoint Foundation, SharePoint Server 2007, and associated free SharePoint web parts and templates.

    Read the article

  • Make a file non-deletable on a USB drive

    - by MegaNairda
    Somebody used my USB drive, and upon returning it to me, I found an autorun.inf that is undeletable. I tried changing its file attributes, which are only H (it's not even set as a system file), but it keeps saying Access Denied. The USB drive is formatted FAT32. Upon asking my friend, he told me that he uses Panda USB Vaccine (http://research.pandasecurity.com/Panda-USB-and-AutoRun-Vaccine/). How do they do this? I'm trying to use a disk sector editor, but I have no idea which hex values they change to create this kind of file, or how to make it deletable again. Formatting the drive removes it, but I'm curious how to set those kinds of file attributes myself.
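    One hedged thing to try before reformatting, not from the original post: USB vaccines typically create AUTORUN.INF as a directory containing an entry whose name is illegal under Win32 naming rules, which is why Explorer and attrib report Access Denied. The \\?\ path prefix bypasses Win32 name parsing and can sometimes remove such entries; E: is assumed to be the flash drive letter:

        del /a /f "\\?\E:\autorun.inf"
        rd /s /q "\\?\E:\autorun.inf"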

    Read the article

  • SQL SERVER – Data Sources and Data Sets in Reporting Services SSRS

    - by Pinal Dave
    This example is from Beginning SSRS by Kathi Kellenberger. Supporting files are available with a free download from the www.Joes2Pros.com web site.

    Connecting to Your Data

    When I was a child, the telephone book was an important part of my life. Maybe I was just a nerd, but I enjoyed getting a new book every year to page through to learn about the businesses in my small town or to discover where some of my school acquaintances lived. It was also the source of maps to my town's neighborhoods and the towns that surrounded me. To make a phone call, I would need a telephone number. In order to find a telephone number, I had to know how to use the telephone book. That seems pretty simple, but it resembles connecting to any data. You have to know where the data is and how to interact with it. A data source is the connection information that the report uses to connect to the database. You have two choices when creating a data source: whether to embed it in the report or to make it a shared resource usable by many reports.

    Data Sources and Data Sets

    A few basic terms will make the upcoming choices make more sense. What database on what server do you want to connect to? It would be better to just ask, "what is your data source?" The connection you need to make to get your report's data is called a data source. If you connected to a data source (like the JProCo database) there may be hundreds of tables. You probably only want data from just a few tables. This means you want to write a specific query against this data source. A query on a data source to get just the records you need for an SSRS report is called a data set.

    Creating a local Data Source

    You can embed a connection from your report directly to your JProCo database, which (let's say) is installed on a server named Reno. If you move JProCo to a new server named Tampa, then you need to update the data source. If you have 10 reports in one project that were all pointing to the JProCo database on the Reno server, then they would all need to be updated at once. It's possible to make a project-level data source and have each report use that. This means one change can fix all 10 reports at once. This would be called a shared data source.

    Creating a Shared Data Source

    The best advice I can give you is to create shared data sources. The reason I recommend this is that if a database moves to a new server, you will have just one place in Report Manager to make the server name change. That one change will update the connection information in all the reports that use that data source. To get started, you will start with a fresh project. Go to Start > All Programs > SQL Server 2012 > Microsoft SQL Server Data Tools to launch SSDT. Once SSDT is running, click New Project to create a new project. Once the New Project dialog box appears, fill in the form as shown. Be sure to select Report Server Project this time – not the wizard. Click OK to dismiss the New Project dialog box. You should now have an empty project, as shown in the Solution Explorer. A report is meant to show you data. Where is the data? The first task is to create a shared data source. Right-click on the Shared Data Sources folder and choose Add New Data Source. The Shared Data Source Properties dialog box will launch, where you can fill in a name for the data source. By default, it is named DataSource1.
    The best practice is to give the data source a more meaningful name. It is possible that you will have projects with more than one data source and, by naming them, you can tell one from another. Type the name JProCo for the data source name and click the Edit button to configure the database connection properties. If you take a look at the types of data sources you can choose, you will see that SSRS works with many data platforms including Oracle, XML, and Teradata. Make sure SQL Server is selected before continuing. For this post, I am assuming that you are using a local SQL Server and that you can use your Windows account to log in to the SQL Server. If, for some reason, you must use SQL Server authentication, choose that option and fill in your SQL Server account credentials. Otherwise, just accept Windows authentication. If your database server was installed locally and with the default instance, just type in Localhost for the server name, then select the JProCo database from the database list. If you have installed a named instance of SQL Server, you will have to specify the server name like this: Localhost\InstanceName, replacing InstanceName with whatever your instance name is. If you are not sure about the named instance, launch the SQL Server Configuration Manager, found at Start > All Programs > Microsoft SQL Server 2012 > Configuration Tools. If you have a named instance, the name will be shown in parentheses. A default instance of SQL Server will display MSSQLSERVER; a named instance will display the name chosen during installation. Once you get the connection properties filled in, click OK to dismiss the Connection Properties dialog box and OK again to dismiss the Shared Data Source properties. You now have a data source in the Solution Explorer.

    What's next

    I really need to thank Kathi Kellenberger and Rick Morelan for sharing this material for this 5-day series of posts on SSRS. To get really comfortable with SSRS, you will get to know the different SSDT windows, build reports on your own (without the wizards), add report headers and footers, accept user input, and create levels, charts, or even maps for visual appeal. You might be surprised to know that a small 230-page book starts from the very beginning and covers the steps to do all these items. Beginning SSRS 2012 is a small, easy-to-follow book, so you can learn SSRS for less than $20. See Joes2Pros.com for more on this and other books. If you want to learn SSRS in easy, simple words, I strongly recommend you get the Beginning SSRS book from Joes 2 Pros.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL. Tagged: Reporting Services, SSRS

    Read the article

  • Preloading multiple comboboxes/listbox ItemsSource with enumerated values using WCF RIA Services

    - by Dale Halliwell
    I would like to be able to load several RIA entity sets in a single call, without chaining/nesting several small LoadOperations so that they load sequentially. I have several pages that have a number of comboboxes on them. These comboboxes are populated with static values from a database (for example, status values). Right now I preload these values in my VM with one method that strings together a series of LoadOperations for each type that I want to load. For example:

        public void LoadEnums()
        {
            context.Load(context.GetMyStatusValues1Query()).Completed += (s, e) =>
            {
                this.StatusValues1 = context.StatusValues1;
                context.Load(context.GetMyStatusValues2Query()).Completed += (s1, e1) =>
                {
                    this.StatusValues2 = context.StatusValues2;
                    context.Load(context.GetMyStatusValues3Query()).Completed += (s2, e2) =>
                    {
                        this.StatusValues3 = context.StatusValues3;
                        // ... and so on
                    };
                };
            };
        }

    While this works fine, it seems a bit nasty. Also, I would like to know when the last LoadOperation completes, so that I can load whatever entity I want to work on after this, so that these enumerated values resolve properly in form elements like comboboxes and listboxes. (I think) I can't do this easily above without creating a delegate and calling that on the completion of the last LoadOperation. So my question is: does anyone out there know a better pattern to use, ideally one where I can load all my static entity sets in a single LoadOperation?
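    A hedged alternative sketch, assuming the context and query names from the question: issue all the loads at once and count completions, so the loads run in parallel and a single callback fires after the last one, with no nesting:

        public void LoadEnums(Action onAllLoaded)
        {
            int pending = 3;
            Action oneDone = () =>
            {
                // LoadOperation callbacks arrive on the UI thread, so a plain
                // decrement is safe here; use Interlocked.Decrement otherwise.
                if (--pending == 0 && onAllLoaded != null) onAllLoaded();
            };

            context.Load(context.GetMyStatusValues1Query()).Completed += (s, e) =>
                { this.StatusValues1 = context.StatusValues1; oneDone(); };
            context.Load(context.GetMyStatusValues2Query()).Completed += (s, e) =>
                { this.StatusValues2 = context.StatusValues2; oneDone(); };
            context.Load(context.GetMyStatusValues3Query()).Completed += (s, e) =>
                { this.StatusValues3 = context.StatusValues3; oneDone(); };
        }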

    Read the article

  • no such file to load -- yet the gem is in config.gem in environment.rb in Rails

    - by Angela
    I keep getting this error for different gems; the most recent is acts_as_reportable:

        no such file to load -- acts_as_reportable

    However, I have the following line in my environment.rb:

        config.gem 'acts_as_reportable'

    And I also ran the following:

        gem install acts_as_reportable

    It output:

        Successfully installed acts_as_reportable-1.1.1
        1 gem installed
        Installing ri documentation for acts_as_reportable-1.1.1...
        Installing RDoc documentation for acts_as_reportable-1.1.1...

    What do I need to do?
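    A hedged guess at the fix: acts_as_reportable is part of the Ruport family, and its require path differs from the gem name, so Rails' default require (the gem name itself) fails. The :lib option tells config.gem what to require; the path below is an assumption worth verifying against the gem's README:

        # environment.rb (assumed require path)
        config.gem 'acts_as_reportable', :lib => 'ruport/acts_as_reportable'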

    Read the article

  • Database Replication

    - by tanthiamhuat
    I have tried to follow the steps outlined in http://www.howtoforge.com/mysql_database_replication for database replication. I have created the database exampledb, created some tables, and loaded them with values. But when I execute

        USE exampledb;
        FLUSH TABLES WITH READ LOCK;
        SHOW MASTER STATUS;

    I do not get any output; it says 0 rows affected. Why is that?
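    An empty SHOW MASTER STATUS usually means binary logging is not enabled, so a hedged first check; the my.cnf lines below are the ones that guide has you add on the master (restart mysqld after editing):

        -- should report log_bin = ON; if OFF, the [mysqld] section needs:
        --   log-bin      = mysql-bin
        --   binlog-do-db = exampledb
        --   server-id    = 1
        SHOW VARIABLES LIKE 'log_bin';
        SHOW MASTER STATUS;   -- should now return a File and Position row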

    Read the article

  • MySQL Replication

    - by ychian
    My current database design mainly uses MyISAM as the storage engine. I wonder if it's possible to split the tables so that some use MyISAM and some use InnoDB in the same database. The reason for switching some of the tables to InnoDB is that I need the row-based locking that InnoDB offers. I am not too sure whether this would have any effect on replication?
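    Mixing engines per table in one database is supported, and replication works at the binary-log level rather than the engine level, so a hedged sketch of the switch; the table names are placeholders:

        -- switch only the tables that need row-level locking
        ALTER TABLE orders      ENGINE=InnoDB;
        ALTER TABLE order_lines ENGINE=InnoDB;
        -- the remaining tables can stay MyISAM; a slave may even use a
        -- different engine for its copy of a table than the master does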

    Read the article
