Search Results

Search found 3046 results on 122 pages for 'tom smith'.


  • Programmatically add/delete users in Exchange

    - by Terry Gamble
    I've got the following set up: an ASP.NET site that allows my internal employees to enter new-hire information (no secure data, just things like name/address/phone). When they submit, it goes into a SQL database. Every few minutes a service runs that checks the database, and if there are new entries it adds them into Exchange. The issue is that I'm not happy with the way the service does things (it isn't putting the address, etc. into the account). As I don't have the source code for it, I'm thinking of recreating it. My issue, though, is even finding a starting point. I know I'll have to build the scripts in code, where the data is retrieved from SQL:

        Joe Smith
        123 Main Street
        Nowhere, USA 19999

    and put that into a PowerShell cmdlet (not sure exactly of the syntax, but I can get that figured out unless someone already has it) where the user is created in Active Directory as a normal user and the mailbox is created simultaneously. From there I just need to fill out fields in Active Directory with the person's address, etc. Finally, I need a deletion routine for when we terminate someone; I'm sure that will simply be a cmdlet that is easily shelled out to, much like the initial one, once I can figure out how to start. Anyone have some good reference points, or has anyone already done this and can share?
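
    A minimal sketch of the kind of cmdlet sequence being asked about, using the Exchange 2007 Management Shell; every name, OU, and database below is a placeholder rather than anything from the actual environment:

        # Create the AD user and the mailbox in one step (Exchange 2007+)
        $password = ConvertTo-SecureString 'TempP@ss1' -AsPlainText -Force
        New-Mailbox -Name 'Joe Smith' -Alias 'jsmith' `
            -UserPrincipalName 'jsmith@example.com' `
            -OrganizationalUnit 'example.com/Employees' `
            -Database 'Mailbox Database' `
            -Password $password

        # Fill in the address fields on the new AD object
        Set-User -Identity 'jsmith' -StreetAddress '123 Main Street' `
            -City 'Nowhere' -PostalCode '19999'

        # Termination: remove the mailbox (and its AD account)
        Remove-Mailbox -Identity 'jsmith' -Permanent $true -Confirm:$false

    A service could shell out to these via the Exchange Management Shell snap-in after reading each pending row from SQL.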

    Read the article

  • SQL 2008 Replication corrupt data problem

    - by Jonathan K
    We took a SQL 2000 database, took a LiteSpeed backup, and restored it on a SQL 2008 active/passive cluster. We then set up replication to replicate the data back to SQL 2000, so 2008 is the publisher/distributor and 2000 is doing a pull subscription. Everything works well, except we occasionally get corrupt data in varchar/text fields on the subscriber. For example, we have a table with 4500 records. When we run this statement:

        update MedstaffProvider
        set Notes = 'Cell Phone: 360.123.4567 Answering Service: 360.123.9876'
        where LastName = 'smith'

    the record in the 2008 database is updated as expected, but in the subscriber database we'll get gibberish in the Notes field: óPÌ[1] T $Oé[1] ð²ñ. K

    Here's what we know:

    - This is repeatable, meaning we can run that same query all day long and get the same gibberish.
    - If you alter the update statement slightly, the data gets replicated just fine.
    - The collation on both databases is the same.
    - So far we've only detected the problem with text/varchar fields. (The Notes field above is text.)
    - Only one or two records in a table are impacted.
    - The table structure looks identical in 2000 and 2008. We haven't made any changes.

    We have found one solution that fixes the problem: recreate the table in 2008 (say, as MedStaffProvider2), insert all the data, drop the original table, rename the new table to the original name, and set up replication again. After that, the exact same update statement works as expected. Does anyone have any idea what might be happening here? Or are there any other techniques we can use to troubleshoot this? I've found a solution for this, but would really like to understand why it is happening.
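
    A hedged T-SQL sketch of the rebuild procedure described above (constraints, indexes, and permissions would also need to be re-created, which is omitted here):

        -- Copy the data into a fresh table, swap it in, then re-establish replication
        SELECT * INTO MedStaffProvider2 FROM MedstaffProvider;
        DROP TABLE MedstaffProvider;
        EXEC sp_rename 'MedStaffProvider2', 'MedstaffProvider';
        -- re-create keys/indexes/triggers here, then set up the publication again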

    Read the article

  • Slow performance on VMWare Linux server after Tomcat install

    - by Loftx
    We have a VMware ESXi 4.1 server hosting a number of Linux and Windows guests. Recently a new Linux guest was added to this server and seemed to be performing well. Tomcat and some other applications were then installed on it, which seems to have caused the guest to run really slowly without any obvious resource issues. The slow performance includes:

    - The time taken to bring up the password prompt over SSH is a few seconds, when it was previously instantaneous.
    - Unzipping a zip file, which previously took a few seconds, now takes around 30 seconds.
    - The time taken to compile VMware Tools has increased by a similar factor.

    Neither the VMware console nor the monitoring commands report any issues with high CPU or memory usage, but something is obviously slowing the server down somehow. Does anyone have any ideas what may be causing this issue and how it can be resolved? Thanks, Tom

    Edit: As per your questions, I've looked at some of the performance indicators on both the VM host and the VM guest. Firstly, I tried reserving the full amount of memory (3 GB) for this VM; no other machines on this server have any memory reservation.

    - The swap-in and swap-out rates for the VM host and guest are now both zero.
    - Balloon memory on the guest is zero and on the host is 3.5 GB (total memory on the host is 12 GB).
    - The swap rate for the guest is also zero. Swap used by the host is 200 MB on average.
    - Compression and decompression rates for the host and guest are zero.
    - Command aborts for the host are zero.
    - Read latency is very low: maximum 10 ms, average 0.8 ms.
    - Write latency is higher: a few spikes to 170 ms but mostly around 25 ms. Is this bad?
    - Queue command latency is zero.
    - Physical disk read latency averages 5 ms but is often 10 ms; physical disk write latency averages 15 ms but is often 20 ms.

    I hope this helps - let me know if you need any more information.

    Read the article

  • Mail queue directory stuck in IIS SMTP server

    - by Loftx
    Hi there, we have an IIS SMTP server which sends out a largish number of mails (4,000 or so) in batches overnight, and recently we've seen mails get "stuck" in the queue directory. Normally restarting the SMTP service seems to fix this, but it's happened a few times, so I'm looking for more information. We sent out around 12,000 emails last night in three batches of roughly 4,000. Around 10 hours later there are still 2,000 or so in the queue directory which don't seem to be leaving the queue. Any new mails which appear in the queue are picked up almost immediately and sent to their destination, but these 2,000 or so don't seem to move.

    Looking at the date modified on the emails, some match up with the time they were sent, but around 1,000 of them have modified dates stretching up to now. For example, there was one mail with a date in the message headers of 5:30 this morning, but its date modified is 11:50, and there are 3 other messages with a date modified of 11:50, then 5 with 11:49 and 2 with 11:45, stretching back for a few hours, all with actual message headers far earlier.

    The logs for the server look like this:

        11:54:52 127.0.0.1 EHLO - 250
        11:54:52 127.0.0.1 MAIL - 250
        11:54:52 127.0.0.1 RCPT - 250
        11:54:52 127.0.0.1 DATA - 250
        11:54:52 127.0.0.1 QUIT - 240
        11:54:53 85.115.62.190 - - 0
        11:54:53 85.115.62.190 EHLO - 0
        11:54:53 85.115.62.190 - - 0
        11:54:53 85.115.62.190 MAIL - 0
        11:54:53 85.115.62.190 - - 0
        11:54:53 85.115.62.190 RCPT - 0
        11:54:53 85.115.62.190 - - 0
        11:54:53 85.115.62.190 DATA - 0
        11:54:53 85.115.62.190 - - 0
        11:54:54 85.115.62.190 - - 0
        11:54:54 85.115.62.190 QUIT - 0
        11:54:54 85.115.62.190 - - 0

    All codes are either 250, 240, or 0. I believe 250 and 240 indicate success, but I don't know what all the 0s are. Could someone with more experience of mail server troubleshooting give me a hand or tell me what to try next? Thanks, Tom

    Read the article

  • Sluggish Windows SBS 2003

    - by TomWilsonFL
    One of my customers has a Windows 2003 Small Business Server which at this point is basically the DC, DNS, file server, and Symantec Protection Manager. I have disabled Exchange because I moved their mail to Google Apps. The server is extremely sluggish when doing anything. It is most noticeable when a dialog box is open (say, the System Properties) and you try to change tabs: this is usually instant, but on this machine it can take 3-5 seconds. What additional services / packages can I uninstall from this machine, knowing that it is only performing the above roles? Will removing the "Small Business Server" package in Add / Remove Programs get rid of a few unnecessary things? Any other thoughts? P.S. I know Symantec Endpoint and the Protection Manager are hogs, but I have nothing to replace the solution with at the moment. Thanks, Tom

    UPDATE: I looked over the different performance metrics, but nothing stood out as a problem. One of my friends mentioned that Symantec's log and temp files can get quite huge and slow things down, so I ran CCleaner on the machine and found close to 3 GB of Symantec "stuff." Removed that, and now the machine is MUCH better. I am still unsure why data just sitting there would cause such a slowdown; the drive is not even near full. The only thing I can imagine is that Symantec must have to run through this stuff now and then.

    Read the article

  • Laptop accessories for mobile warrior (light power adapter & case/bag)

    - by wonsungi
    Lugging my X301 between work and home, I realized my laptop's accessories weigh more than the laptop itself! I'm ordering a second AC power adapter so I don't even have to carry one at all, but I may as well get the lightest one possible. My X301 came with a pretty svelte 65W power adapter, but can anyone suggest a lighter power adapter or confirm the weights I've found below?

        mass   vol      dimensions    W    Model
        ----   -------  -----------   ---  -------------------
        210g   149cm^3  108x46x30mm   65W  Coolermaster [NA 65]
        244g   189cm^3  140x75x18mm   65W  ThermalTake [ADP65W0001]
        260g   130cm^3  104x43x29mm   65W  Lenovo (came with X301)
        326g   198cm^3  145x76x18mm   95W  Coolermaster [SNA 95]
        330g   180cm^3  150x60x20mm   90W  Kensington USB [K38030US]

    Apple's 60W power adapter seems much smaller/lighter than the PC products listed above, so I think a better PC power adapter could exist. There are much smaller 45W "netbook" adapters, but are these too weak for my X301? I would not mind if it just meant the battery couldn't charge while the laptop was on, but I am afraid there would be worse consequences. Also, I have decided to swap my Logitech Kinetik briefcase for a Tom Bihn Ristretto: less protection, but much lighter, less bulky, and easier to carry. Any suggestions for better laptop cases/bags?

    Read the article

  • Any Recommendations for a Web Based Large File Transfer System?

    - by Glen Richards
    I'm looking for a server software product that:

    - Allows my users to share large files with the general public, securely, with 1 or more people (notification via email, optionally with a token that gives them x period of time to download)
    - Allows anyone in the general public to share files with my users, perhaps by invitation
    - Is user-friendly enough that my users can work with it without having to bug me as the admin
    - Can be installed on our own server (we don't want shared data sitting on anyone else's server)
    - Is a web-based solution; using some kind of secure comms channel would be good too, e.g. SSH
    - Handles files over 1 GB

    I found the question below; WebDAV does not sound user-friendly enough: http://serverfault.com/questions/86878/recommendations-for-a-secure-and-simple-dropbox-system

    I've done a lot of searching, but I can't get the search terms right. There are too many services that provide this, but I want something we can install on our own server. A last resort would be to roll my own. Any ideas appreciated. Glen

    EDIT: Sorry Tom and Jeff, but Glen specifically says that he's looking for a 'product', so given that I specialise in this field, I thought my expertise in this area may have been of use to him. I don't see how writing his own services is going to be easy for him to maintain going forward (a large IT admin overhead) or simple for his users and the general public to work with.

    Read the article

  • Access denied error 3221225578 with file sharing to Windows server

    - by Ian Boyd
    I'm trying to access the shares on a server. The credential box appears, I enter a correct username and password, and I get access denied. The silly thing is that I can Remote Desktop to the server (using the same credentials), and I can check the Security event log for the access denied errors:

        Event Type:     Failure Audit
        Event Source:   Security
        Event Category: Account Logon
        Event ID:       681
        Date:           3/19/2011
        Time:           11:54:39 PM
        User:           NT AUTHORITY\SYSTEM
        Computer:       STALWART
        Description:
        The logon to account: Administrator
        by: MICROSOFT_AUTHENTICATION_PACKAGE_V1_0
        from workstation: HARPAX
        failed. The error code was: 3221225578

    and

        Event Type:     Failure Audit
        Event Source:   Security
        Event Category: Logon/Logoff
        Event ID:       529
        Date:           3/19/2011
        Time:           11:54:39 PM
        User:           NT AUTHORITY\SYSTEM
        Computer:       STALWART
        Description:
        Logon Failure:
            Reason:                 Unknown user name or bad password
            User Name:              Administrator
            Domain:                 stalwart
            Logon Type:             3
            Logon Process:          NtLmSsp
            Authentication Package: NTLM
            Workstation Name:       HARPAX

    Looking up the error code (3221225578), I get an article on TechNet ("Audit Account Logon Events" by Randy Franklin Smith):

        Table 1 - Error Codes for Event ID 681
        Error Code   Reason for Logon Failure
        3221225578   The username is correct, but the password is wrong.

    This would seem to indicate that the username is correct but the password is wrong. I've tried the password many times - uppercase, lowercase, on different user accounts, with and without prefixing the username with servername\username. What gives that I cannot access the server over file sharing, but I can access it over RDP?

    Read the article

  • RDP problem with Vista and Windows 7 destination

    - by MadBison
    I use a server at home to host a bunch of concurrently running Hyper-V VMs with different OSes and software for testing. I have Vista on the laptop, with all the latest SPs and patches. The server is Server 2008 R2, fully patched. The guests are a mix of XP, Vista, Server 2008, and Windows 7. If I connect to the Win XP or Server 2008 guests using RDP, it is always good: very quick, no speed issues. If I connect to the Vista or Win 7 guests, the response time is so slow it is unusable - usually 6 or 8 seconds, and at times too long to measure! This happens from both the laptop running Vista and the server running Server 2008 R2. Does anyone know what the issue is with RDP to Vista and Windows 7 destinations? I did read this: http://blog.tmcnet.com/blog/tom-keating/microsoft/remote-desktop-slow-problem-solved.asp but that is not my problem; I have applied that change to all PCs.
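
    For reference, my understanding (an assumption about the linked post, not a quote from it) is that the change it describes is the Vista/7 TCP auto-tuning tweak, along these lines:

        netsh interface tcp set global autotuninglevel=disabled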

    Read the article

  • Mail.app doesn't detect sender in Address Book

    - by CoreSandello
    I don't understand how 'smart addresses' in Mail.app work. Recently I noticed that for some emails I don't see the person's full name in the 'From' column. I started to dig into this behavior and found out that I have a few contacts in my Address Book that are not recognized by Mail.app. Here is how it looks: I have a person in Address Book with a filled email entry and a filled first/last name (localized). I have an incoming email from that person (from the email specified in Address Book), but the first/last name in the email itself doesn't match the ones specified in Address Book (e.g. the 'From' field in the email looks like 'John [work] <[email protected]>' while the Address Book entry is 'John Smith', localized, in Russian). And Mail.app doesn't recognize that this mail originates from that person in Address Book: if I click on the 'From' field, it suggests adding the sender to Address Book, while for other emails I get a 'Show in Address Book' menu entry (especially for ones with a full localized name in the 'From' field). I'm wondering: is this behavior correct, or am I missing something? I'm using Snow Leopard and Mail 4.0; my system language is set to English, if that matters. I'd like some clarification on that Mail.app behavior - whether it's fixable or not (and if it's fixable, I'd like to see a fix). By the way, is it possible to match a sender's address against an Address Book entry in filter rules? It would be great if I could create rules like 'move all mail from that person to that folder' without specifying the exact source address. Thanks, Ivan.

    Read the article

  • Active Directory Username Formats

    - by Brent Pabst
    Hi, I'm working on an open source project that will manage Active Directory users. I am looking for feedback from Windows/Active Directory admins on the formats of usernames they prefer or their organization uses. I want to make sure the software allows admins to use the most popular formats when new users are created. Here is the list I have so far:

    1. <firstname><lastname>
    2. <lastname><firstname>
    3. <lastname><firstinitial>
    4. <lastname><firstinitial><middleinitial>
    5. <firstinitial><lastname>
    6. <firstinitial><middleinitial><lastname>
    7. <firstname><lastinitial>

    In addition, how do you handle multiple identical names? If two John Smiths exist, do you append a numeric suffix, or interject a middle initial or name to solve the problem? Thanks for the feedback.

    Read the article

  • maximum number of connections Squid

    - by Isaac
    I have a Squid proxy server that controls all internet traffic for my network. I need a way to stop users from downloading big files (say 50 MB) on my network. I banned some well-known ports (e.g. torrent), but some downloads are still possible over the HTTP port, and obviously I cannot ban port 80! A simple solution is limiting the maximum number of simultaneous connections for each IP (e.g. 3 connections). It's possible in Squid with this config:

        acl ACCOUNTSDEPT src 192.168.5.0/24
        acl limitusercon maxconn 3
        http_access deny ACCOUNTSDEPT limitusercon

    But this solution has a really bad impact on web browsing, because any smart browser fetches different parts of a website over several simultaneous connections to speed up browsing. With a cap on the number of connections, the browser will fail to fetch some parts, and the website will be rendered partially, with some parts/images/frames missing. So, can we limit the maximum number of persistent connections instead? I think this policy would work: terminate any connection that stays alive for more than 10 seconds, but leave the number of simultaneous connections for each IP unlimited. But how can we implement this policy in Squid? With which config?

    UPDATE: artifex and Tom Newton suggested using a bandwidth-limiting approach to fight downloaders. But bandwidth-limiting in Squid has a shortcoming: it's static and cannot change dynamically, so a person gets a limited bandwidth no matter how many people are using the internet (maybe nobody!). Also, this solution doesn't stop people from downloading; they can still download, just at a lower speed. But if we find a way to terminate persistent connections (or any connection that stays alive more than a specific time), downloading big files becomes almost impossible (though there is always some way!).
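
    A hedged alternative that targets the file size directly rather than the connection lifetime: Squid's stock reply_body_max_size directive caps how large a response body may be. A sketch (the exact syntax differs between Squid 2.x and 3.x):

        acl lanusers src 192.168.5.0/24
        # Squid 3.x form: refuse responses over 50 MB for these clients
        reply_body_max_size 50 MB lanusers
        # Squid 2.x form (size in bytes, with allow/deny):
        # reply_body_max_size 52428800 allow lanusers

    Note that when the server sends no Content-Length header, Squid can only abort the transfer once the limit is actually reached.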

    Read the article

  • NFS Datastore Appears Empty!

    - by daemonchild
    Hi guys, I've got an NFS server problem. The datastore is connected and appears to be a valid datastore both in the vSphere client and under /vmfs/volumes. The issue is that it appears to be empty! I can create files (e.g. touch /vmfs/volumes/nfs_common/thefile) and they are correctly written to the NFS store; I can verify this by looking on the NFS server itself. But the vmkernel only sees an empty datastore; the file disappears. Another FreeBSD box can mount the same NFS share and see the files correctly. Some useful data:

    - ESXi 4.0.0 Build 208167
    - NFS is unfsd running on a Buffalo Linkstation Pro Duo (a bit hacky, I know)
    - The share has its file system permissions set to 777 at the moment
    - My /etc/exports is as follows, and as I say, it connects fine:

        /mnt/array1/ESX_Shared 192.168.16.0/255.255.255.0(insecure,rw,sync,no_root_squash,no_subtree_check)

    The ESXi servers can also successfully mount NFS shares from other NFS servers. Any ideas guys? Thanks, Tom

    Read the article

  • How important is dual-gigabit LAN for a super user's home NAS?

    - by Andrew
    Long story short: I'm building my own home server based on Ubuntu with 4 drives in RAID 10. Its primary purpose will be NAS and backup. Would I be making a terrible mistake by building a NAS Server with a single Gigabit NIC? Long story long: I know the absolute max I can get out of a single Gigabit port is 125MB/s, and I want this NAS to be able to handle up to 6 computers accessing files simultaneously, with up to two of them streaming video. With Ubuntu NIC-bonding and the performance of RAID 10, I can theoretically double my throughput and achieve 250MB/s (ok, not really, but it would be faster). The drives have an average read throughput of 83.87MB/s according to Tom's Hardware. The unit itself will be based on the Chenbro ES34069-BK-180 case. With my current hardware choices, it'll have this motherboard with a Core i3 CPU and 8GB of RAM. Overkill, I know, but this server will be doing other things as well (like transcoding video). Unfortunately, the only Mini-ITX boards I can find with dual-gigabit and 6 SATA ports are Intel Atom-based, and I need more processing power than an Atom has to offer. I would love to find a board with 6 SATA ports and two Gigabit LAN ports that supports a Core i3 CPU. So far, my search has come up empty. Thus, my dilemma. Should I hold out for such a board, go with an Atom-based solution, or stick with my current single-gigabit configuration? I know there are consumer NAS units with just one gigabit interface (probably most of them), but I think I will demand a lot more from my server than the average home user. Any advice is appreciated. Thanks.
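
    For reference, a minimal sketch of the Ubuntu NIC bonding mentioned above, via /etc/network/interfaces (the interface names and addressing are assumptions, and the ifenslave package is required):

        auto bond0
        iface bond0 inet static
            address 192.168.1.10
            netmask 255.255.255.0
            bond-mode balance-rr      # round-robin; 802.3ad needs switch support
            bond-miimon 100           # link-check interval in ms
            bond-slaves eth0 eth1

    Whether bonding actually doubles single-client throughput depends on the bonding mode and the switch, which is worth weighing against holding out for a dual-NIC board.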

    Read the article

  • SharePoint AD-imported users are becoming sporadically corrupted, forcing us to create new accounts

    - by TrevJen
    SharePoint 2007 (MOSS) with AD-imported users; all servers are 2008. I have around 50 users. Over the past 2 months, a handful of them have suddenly become unable to log in to SharePoint: when they log in, they either get a blank screen or are reprompted. These users are on accounts that have been in use for many months; sometimes the problem starts with a password change. In all cases, the user's account works on every other Active Directory authenticated resource (domain, Exchange, LDAP). In the most recent case, last night I was forced to delete a user ("John Smith") because of corruption. The original account name was jsmith. I deleted him from Active Directory, then deleted him from the profile list in SharePoint Shared Services. I could not find a way to delete him from the SharePoint user list, but I reran the import after recreating his account (renaming it too, just to be sure, to "smithj"). At first this did not work; the user could still access all other resources but not SharePoint. Then, some 30 minutes later, it inexplicably started working. This morning the user changed passwords, which immediately broke the SharePoint login again. I am at a loss on how to troubleshoot this.
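
    For what it's worth, MOSS does expose a command-line way to remove a user from a site collection's user list, which may help with the "could not find a way to delete him" step - a sketch with the URL and login as placeholders:

        stsadm -o deleteuser -url http://sharepoint.example.com -userlogin DOMAIN\jsmith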

    Read the article

  • Wake for Network Access Apache server in OS X 10.8, followup

    - by Gary
    Sorry, I can't seem to post this response within the same thread. Thank you both (Zoredache and Gordon) for your answers, but the fix seems temporary. I entered the command you suggested, and it seemed to work:

        ...smith$ Registering Service ApacheNoDoz._http._tcp.local port 80
        DATE: ---Fri 14 Sep 2012---
        12:04:15.813  ...STARTING...
        12:04:16.566  Got a reply for service ApacheNoDoz._http._tcp.local.:
                      Name now registered and active

    So, I checked for it on my G5:

        Browsing for _http._tcp
        Timestamp     A/R  Flags  if  Domain   Service Type  Instance Name
        (lots of Bonjour printers omitted)...
        12:07:38.370  Add    2    4   local.   _http._tcp.   ApacheNoDoz
        12:07:45.921  Rmv    0    4   local.   _http._tcp.   ApacheNoDoz

    So, it was running at 12:07:38, at which time the host was asleep. But shortly after, the registration seems to have been removed, and I don't know why. Does this mean I can never let the CPU sleep, or is there something else I have to set? Thanks, again.

    Read the article

  • Boot drive not found issue after cloning using Apricorn EZgig

    - by TomWilsonFL
    A couple of days ago I cloned a drive for someone using the EZgig software. Usually this goes without a hitch, but this particular drive I was cloning is quite old. When I restarted with the new drive I received the typical "bootable disk not found" message, so I turned the machine off, messed with the BIOS, restarted, and it came up fine. That night I was working remotely on the computer and had to restart it. It didn't come back up - not a good sign. When the user came to the computer in the morning it was giving the same message. I have found that to make the computer boot, all I have to do is go into the BIOS, "Load Defaults", and restart. It will then boot and run great. Any thoughts on what is causing this? Is it MBR corruption? Are some settings failing to be saved in the CMOS? A couple of points worth mentioning: I have already looked for a BIOS update for the computer, but the newest is already installed (from 2003). When the computer reboots it either shows "None" for Primary Master, or sometimes it will just not show anything. Thanks, Tom
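
    If MBR corruption is the suspect, one low-risk check - assuming an XP-era Windows install, which the 2003-vintage BIOS suggests - is the Recovery Console's repair commands from the install CD:

        fixmbr
        fixboot C:
        chkdsk C: /r

    That said, settings not surviving a power-off would point more toward the CMOS battery than the disk.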

    Read the article

  • ExtJS: Combobox in EditorGridPanel not selecting the desired item (with test case)

    - by TomH
    I'm using ExtJS to create an EditorGridPanel with a combobox as the editor for a cell. The combobox in my EditorGridPanel is not working as I'd expect it to: when the user types the first letter of an item in the drop-down list, the combobox seems to ignore it and select the first item in the list. I can reproduce the error consistently and have put together a test case here: http://cluebucket.com/dev/testcase/testcase.html

    Load the page and reproduce the behavior with the following steps. Note that this is all done using the keyboard; no mouse clicks:

    1. Click 'Add Record' (a new row is added to the grid)
    2. Enter text in the text field
    3. TAB to the Priority field without selecting anything (None will remain selected)
    4. TAB out of the Priority field (a new row is added to the grid)
    5. Enter text and TAB to the Priority field
    6. Type v (Very High is selected)
    7. TAB out of the Priority field (a new row is added to the grid)
    8. Enter text and TAB to the Priority field
    9. Type v (None is selected, but Very High should have been)
    10. TAB out of the Priority field
    11. Enter text and TAB to the Priority field
    12. Type l ('el') (Low is selected)
    13. TAB out, enter text, TAB to Priority
    14. Type l (None is selected)

    It appears that whenever the user attempts to select the same value that was selected in the previous row, the combobox selects None. Any ideas? The code is available at cluebucket.com/dev/testcase/js/testcase.js - thoughts/pointers/corrections are appreciated! Thanks, tom
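
    One hedged guess worth testing against that test case: an Ext 2/3 ComboBox caches its last query, so re-opening the editor and typing the same letter as last time may skip the store filter. A sketch of an editor config that defeats the cache (the store and names here are invented, not taken from the test case):

        var priorityEditor = new Ext.form.ComboBox({
            store: priorityStore,   // assumed local store of priority labels
            mode: 'local',
            triggerAction: 'all',   // always show the full, unfiltered list
            typeAhead: true,
            selectOnFocus: true,
            forceSelection: true,
            lastQuery: ''           // pre-seed so the first typed query isn't treated as cached
        });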

    Read the article

  • MVVM Light Toolkit - RelayCommands, DelegateCommands, and ObservableObjects

    - by DanM
    I just started experimenting with Laurent Bugnion's MVVM Light Toolkit. I think I'm going to really like it, but I have a couple of questions. Before I get to them, I need to explain where I'm coming from. I currently use a combination of Josh Smith's MVVM Foundation and another project on CodePlex called MVVM Toolkit: ObservableObject and Messenger from MVVM Foundation, and DelegateCommand and CommandReference from MVVM Toolkit. The only real overlap between MVVM Foundation and MVVM Toolkit is that they both have an implementation of ICommand: MVVM Foundation has a RelayCommand and MVVM Toolkit has a DelegateCommand. Of these two, DelegateCommand appears to be more sophisticated: it employs a CommandManagerHelper that uses weak references to avoid memory leaks. With that said, a couple of questions:

    1. Why does MVVM Light Toolkit use RelayCommand rather than DelegateCommand? Is the use of weak references in an ICommand unnecessary or not recommended for some reason?

    2. Why is there no ObservableObject in MVVM Light? ObservableObject is basically just the part of ViewModelBase that implements INotifyPropertyChanged, but it's very convenient to have as a separate class, because view-models are not the only objects that need to implement INotifyPropertyChanged. For example, say you have a DataGrid that binds to a list of Person objects. If any of the properties of Person can change while the user is viewing the DataGrid, Person needs to implement INotifyPropertyChanged. (I realize that if Person is auto-generated by something like LINQ to SQL, it will probably already implement INotifyPropertyChanged, but there are cases where I need to make a view-specific version of entity model objects, say, because I need to include a command to support a button column in a DataGrid.)

    Thanks.
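
    For readers unfamiliar with the class being discussed, a minimal sketch of what an ObservableObject of the MVVM Foundation sort boils down to (the names follow the question's usage, not any particular library's source):

        // The reusable INotifyPropertyChanged base...
        using System.ComponentModel;

        public abstract class ObservableObject : INotifyPropertyChanged
        {
            public event PropertyChangedEventHandler PropertyChanged;

            protected void RaisePropertyChanged(string propertyName)
            {
                PropertyChangedEventHandler handler = PropertyChanged;
                if (handler != null)
                    handler(this, new PropertyChangedEventArgs(propertyName));
            }
        }

        // ...usable by a plain entity shown in a DataGrid, not just a view-model.
        public class Person : ObservableObject
        {
            private string _name;
            public string Name
            {
                get { return _name; }
                set { _name = value; RaisePropertyChanged("Name"); }
            }
        }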

    Read the article

  • Top-Rated JavaScript Blogs

    - by Andreas Grech
    I am currently trying to find some blogs that focus (almost solely) on the JavaScript language, because most of the time bloggers with real-life experience from work or home development can explain certain quirks and hidden features more clearly and concisely than most official language specifications. Below is a list of JavaScript-centric blogs (I will update the list as more answers flow in):

    - DHTML Kitchen, by Garrett Smith
    - Robert's Talk, by Robert Nyman
    - EJohn, by John Resig (of jQuery)
    - Crockford's JavaScript Page, by Douglas Crockford
    - Dean.edwards.name, by Dean Edwards
    - Ajaxian, by various (@Martin)
    - The JavaScript Weblog, by various
    - SitePoint's JavaScript and CSS Page, by various
    - AjaxBlog, by various
    - Eric Lippert's Blog, by Eric Lippert (talks about JScript and JScript.NET)
    - Web Bug Track, by various (@scunliffe)
    - The Strange Zen Of JavaScript, by Scott Andrew
    - Alex Russell (of Dojo) (@Eran Galperin)
    - Ariel Flesler (@Eran Galperin)
    - Nihilogic, by Jacob Seidelin (@llimllib)
    - Peter's Blog, by Peter Michaux (@Borgar)
    - Flagrant Badassery, by Steve Levithan (@Borgar)
    - ./with Imagination, by Dustin Diaz (@Borgar)
    - HedgerWow (@Borgar)
    - Dreaming in Javascript, by Nosredna
    - spudly.shuoink.com, by Stephen Sorensen
    - Yahoo! User Interface Blog, by various (@Borgar)
    - remy sharp's b:log, by Remy Sharp (@Borgar)
    - JScript Blog, by the JScript Team (@Borgar)
    - Dmitry Baranovskiy's Web Log, by Dmitry Baranovskiy
    - James Padolsey's Blog (@Kenny Eliasson)
    - Perfection Kills; Exploring JavaScript by example, by Juriy Zaytsev
    - DailyJS (@Ric)
    - NCZOnline, by Nicholas C. Zakas (@Kenny Eliasson)

    Which top-rated blogs am I currently missing from the above list that you think should be imperative for any JavaScript developer to read (and follow)?

    Read the article

  • WPF relaycommand from usercontrol

    - by pilsdumps
    Hi, I'm new to WPF and, in the spirit of trying to do things the correct way, have tried to implement MVVM in my application. I've made use of the frequently mentioned article by Josh Smith, and apart from making me realise how little I know, it has left me a little stumped. Specifically, I have a page that uses the RelayCommand object to handle a button directly on the page, and this is fine. However, the button (Save) will ultimately be on a user control that will also contain other buttons, and the control will be used on a number of pages. My question is this: how do I relay the command from the user control to the page (i.e. the viewmodel) containing it? If I bind to the command

        public ICommand SaveCommand
        {
            get
            {
                if (_saveCommand == null)
                {
                    _saveCommand = new RelayCommand(
                        param => this.Save(),
                        param => this.CanSave);
                }
                return _saveCommand;
            }
        }

    on the user control, I would need to use a Save method on the user control itself, when in fact I should be handling it on the viewmodel. Can anyone help?
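
    One common pattern (a sketch, not taken from the article - the CommandBar name and its members are invented here) is to give the user control an ICommand dependency property and bind the inner button to it, so each page's viewmodel supplies its own command:

        using System.Windows;
        using System.Windows.Controls;
        using System.Windows.Input;

        public partial class CommandBar : UserControl
        {
            // The page binds a viewmodel command to this property.
            public static readonly DependencyProperty SaveCommandProperty =
                DependencyProperty.Register("SaveCommand", typeof(ICommand),
                    typeof(CommandBar), new UIPropertyMetadata(null));

            public ICommand SaveCommand
            {
                get { return (ICommand)GetValue(SaveCommandProperty); }
                set { SetValue(SaveCommandProperty, value); }
            }
        }

    Inside the control's XAML, the button binds back to the control itself (Command="{Binding SaveCommand, RelativeSource={RelativeSource FindAncestor, AncestorType={x:Type UserControl}}}"), and each page writes <local:CommandBar SaveCommand="{Binding SaveCommand}" /> so the Save logic stays in the viewmodel.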

    Read the article

  • Binding update adds new series to WPF Toolkit chart (instead of replacing/updating series)

    - by Mal Ross
    I'm currently recoding a bar chart in my app to make use of the Chart class in the WPF Toolkit. Using MVVM, I'm binding the ItemsSource of a ColumnSeries in my chart to a property on my viewmodel. Here's the relevant XAML:

        <charting:Chart>
            <charting:ColumnSeries ItemsSource="{Binding ScoreDistribution.ClassScores}"
                                   IndependentValuePath="ClassName"
                                   DependentValuePath="Score"/>
        </charting:Chart>

    And the property on the viewmodel:

        // NB: viewmodel derived from Josh Smith's BindableObject
        public class ExamResultsViewModel : BindableObject
        {
            // ...
            private ScoreDistributionByClass _scoreDistribution;
            public ScoreDistributionByClass ScoreDistribution
            {
                get { return _scoreDistribution; }
                set
                {
                    if (_scoreDistribution == value)
                    {
                        return;
                    }
                    _scoreDistribution = value;
                    RaisePropertyChanged(() => ScoreDistribution);
                }
            }
        }

    However, when I update the ScoreDistribution property (by setting it to a new ScoreDistribution object), the chart gets an additional series (based on the new ScoreDistribution) as well as keeping the original series (based on the previous ScoreDistribution). To illustrate this, I took a couple of screenshots showing the chart before an update (with a single data point in ScoreDistribution.ClassScores) and after it (now with 3 data points in ScoreDistribution.ClassScores). Now, I realise there are other ways I could be doing this (e.g. changing the contents of the original ScoreDistribution object rather than replacing it entirely), but I don't understand why it's going wrong in its current form. Can anyone help?
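
    For completeness, a sketch of the workaround the question already hints at - keeping one collection instance alive and swapping its contents, so the series binding never sees a new source (ClassScore is an assumed item type; the collection lives in System.Collections.ObjectModel):

        // Expose one long-lived collection instead of replacing ScoreDistribution.
        public ObservableCollection<ClassScore> ClassScores { get; private set; }

        public void UpdateScores(IEnumerable<ClassScore> newScores)
        {
            ClassScores.Clear();
            foreach (ClassScore score in newScores)
                ClassScores.Add(score);
        }

    This sidesteps the duplicate-series symptom but, as the question notes, doesn't explain why rebinding produces it in the first place.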

    Read the article

  • DataContractSerializer does not properly deserialize; values for properties in object are missing

    - by sachin
    My SomeClass:

        [Serializable]
        [DataContract(Namespace = "")]
        public class SomeClass
        {
            [DataMember]
            public string FirstName { get; set; }

            [DataMember]
            public string LastName { get; set; }

            [DataMember]
            private IDictionary<long, string> customValues;

            public IDictionary<long, string> CustomValues
            {
                get { return customValues; }
                set { customValues = value; }
            }
        }

    My XML file:

        <?xml version="1.0" encoding="UTF-8"?>
        <SomeClass>
          <FirstName>John</FirstName>
          <LastName>Smith</LastName>
          <CustomValues>
            <Value1>One</Value1>
            <Value2>Two</Value2>
          </CustomValues>
        </SomeClass>

    The problem is that I am only getting some of the data for my properties when I deserialize:

        var xmlRoot = XElement.Load(new StreamReader(
            filterContext.HttpContext.Request.InputStream,
            filterContext.HttpContext.Request.ContentEncoding));
        XmlDictionaryReader reader =
            XmlDictionaryReader.CreateDictionaryReader(xmlRoot.CreateReader());
        DataContractSerializer ser = new DataContractSerializer(typeof(SomeClass));
        // Deserialize the data and read it from the instance.
        SomeClass someClass = (SomeClass)ser.ReadObject(reader, true);

    So when I check someClass, FirstName will have the value John, but LastName will be null. The mystery is how I can get some of the data but not all of it for the class. So the DataContractSerializer is not pulling all the data from the XML when deserializing. Am I doing something wrong? Any help is appreciated. Thanks in advance. Let me know if anyone has the same problem or has a solution.

    Read the article

  • Handling multiple column data with Java

    - by Ender
    I am writing an application that reads in a large number of basic user details in the following format; once they are read in, it then allows the user to search for a person's details using their email:

        NAME          ROLE         EMAIL
        ---------------------------------------------------
        Joe Bloggs    Manager      [email protected]
        John Smith    Consultant   [email protected]
        Alan Wright   Tester       [email protected]
        ...

    The problem is that I need to store a large number of details for all the people that have worked at the company. The file containing these details will be written on a yearly basis, simply for reporting purposes, but the program will need to be able to access the details quickly. The way I aim to access them is to have the program ask the user for the unique email of a member of staff, then return the name and the role from that line of the file. I've played around with text files, but am struggling with how I would handle multiple columns of data when it comes to searching a large file. What is the best format to store such data in? A text file? XML? The size doesn't bother me, but I'd like to be able to search it as quickly as possible. The file will need to contain a lot of entries, probably over the 10K mark over time.
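
    Whatever the storage format, lookup speed mostly comes from indexing the records by email once at startup. A minimal sketch under the assumption of one tab-separated record (name, role, email) per line:

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;
        import java.util.HashMap;
        import java.util.Map;

        public class StaffDirectory {
            // Index from email to the full {name, role, email} record.
            private final Map<String, String[]> byEmail = new HashMap<String, String[]>();

            public void load(String path) throws IOException {
                BufferedReader in = new BufferedReader(new FileReader(path));
                try {
                    String line;
                    while ((line = in.readLine()) != null) {
                        String[] cols = line.split("\t"); // name, role, email
                        if (cols.length == 3) {
                            byEmail.put(cols[2], cols);
                        }
                    }
                } finally {
                    in.close();
                }
            }

            // Returns {name, role, email} or null if the address is unknown.
            public String[] findByEmail(String email) {
                return byEmail.get(email);
            }
        }

    At around 10K records the whole index fits comfortably in memory; switching to XML or a database would only change how load() parses each record, not the lookup itself.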

    Read the article

  • solr JOIN query

    - by Sfairas
    I need to run a JOIN-style query on a Solr index. I've indexed two kinds of XML documents, person and subject. Person:

        <doc>
          <field name="id">P39126</field>
          <field name="family">Smith</field>
          <field name="given">John</field>
          <field name="subject">S1276</field>
          <field name="subject">S1312</field>
        </doc>

    Subject:

        <doc>
          <field name="id">S1276</field>
          <field name="topic">Abnormalities, Human</field>
        </doc>

    I need to display information only from the person docs, but each query should match fields in both person and subject. In the case where the query matches only the subject doc, I need to display all person docs that have a matching id. Is this possible to do without running two separate queries? Something like a JOIN query would do the job. Any help?
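
    If upgrading is an option: Solr 4.0 and later ship a join query parser that can express exactly this in one request. A sketch, assuming both document types live in the same index and the field names shown above:

        q={!join from=id to=subject}topic:"Abnormalities, Human"

    This matches subject docs on topic, follows their id values into the person docs' subject field, and returns only the person docs. On older Solr releases, two queries (or denormalising the topic into the person docs at index time) remain the usual approaches.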

    Read the article
