
 12 Dec 2014 @ 3:47 PM 

There’s certainly been a lot of news recently about police abuses.  But let’s be honest, cops aren’t the only ones who love to abuse their power.  I mean, federal agents engage in everything from using government resources to hassle people they got dumped by, to using family court proceedings in an attempt to criminalize personal animosity, to behavior that gets you sued by the Southern Poverty Law Center (yeah, the same place that tracks groups like the Ku Klux Klan). When not tormenting helpless immigrants, we have federal agents who try to codify blasphemy commandments into family court settlements.  The mere mention of an agent’s name is enough to give some of them a case of the vapors.  It’s quite logical if you think about it – mention me as part of a lawsuit showing the world I abused my position and abused helpless immigrants: OK.  Allow a spam comment with my name in it to sit on your site for a few days: not OK.  Because the latter puts families at risk, exposes identities and puts people’s jobs at risk (one might wonder, if your name appearing in a comment is enough to threaten your job, maybe the fix is being a little better employee).  You see, look at any of the following:

  • William Ryan
  • William G Ryan,
  • William G. Ryan IV
  • Bill Ryan == William Ryan
  • Bill Ryan
  • William Ryan
  • Bill Ryan, Duncan, SC
  • Bill Ryan, Duncan, SC 29334
  • Bill Ryan, Bill Ryan, Bill Ryan
  • Bill Ryan, Carl Ryan, Mike Jones
  • Bill Ryan is a terrible person. He’s ugly, his mother dresses him funny and he’s just terrible.
  • Bill Ryan is a terrible person who does terrible things like laughing at people he doesn’t like.

You get the idea.  I’m going to make Bill Ryan the preferred search term of this post just for good measure and have a few people link back to it. In the process, I’ve put my job at risk exactly zero times. I just showed a copy of this post to the president of my company and asked if he thinks less of me now. He laughed, asked me “What the hell are you up to now?” and said no.  Not only didn’t this compromise my job, it didn’t compromise my safety, the safety of those around me or the safety of my loved ones.  But it’s different when you’re a federal agent, apparently.  Even one mention (provided it’s on some offshoot technical blog – if it’s on a mainstream site linked from the Southern Poverty Law Center, it’s a totally different matter) is enough to put them at serious risk of job loss and put not only their lives, but the lives of their loved ones (the same loved ones they say “I don’t care what you say about ______, just don’t write about me” about) in serious risk.  Apparently, even putting a front page of Google results in one post to counter accusations made against you puts their lives in jeopardy, if it includes said agent’s name.

So imagine what videotape would do.  If mere text is enough to throw law enforcement members into a panic, then having video of them must be, like, really really bad.  Sadly, in Maryland, officers don’t automatically have standing in family court, so they have to use other means of stopping people from exposing their misdeeds.  In another state, they’d just drag Kianga Mwamba into court, claiming the video somehow violated a family court order; they’d manufacture some Google Cache images, do a little lying, and voilà, they’d be well on their way to getting a settlement demanding no one speak of their misdeeds. But they’re not in that state, so they have to resort to more traditional means – beat the person up, lie about them, arrest them for resisting, and then destroy the evidence.

The Supreme Court ruled that we have a right to film police.  Section 230 of the Communications Decency Act spells out the fact that people aren’t responsible for what others post on their site.  But laws are for the little people, not cops, and certainly not federal agents.

Shockingly, I posted this about 20 minutes ago and I still haven’t been fired, demoted or anything. Nor has anything bad happened to me.  How can that be?  A common acquaintance said, “You know, she feels the need to control people, so she’s very quick to try to force others to do her bidding.”  He went on to tell me how this person is trying to do ANYTHING said person can to get even – indeed, using a serious case of extreme butthurt and incipient animosity to try to criminalize behavior.  That’s one way to put it.

One thing is for sure: those in law enforcement frequently feel like the law doesn’t apply to them.  Kianga Mwamba has recourse against abusive officers and so do all of us – thank God for the inspector general’s office. Standing up to these people is the only thing that works. Yes, they’ll get really mad when you stand up to them. They’ll lie. They might even resort to making up a bunch of things.  But just remember, there’s more than one venue in this country, and you have legal recourse. Whether you choose to use it or not is up to you.  After 4 long years of this crap, it’s finally coming to an end where I can handle this in an appropriate legal venue – I can’t wait.


 09 Dec 2014 @ 11:34 PM 

I just came across Mark Bennett’s rules for dealing with a certain type of people.  The only thing I’d add is that the title should be “10 Practical Rules for Dealing with Borderline Personalities and the pathetic loser she’s married to,” but other than that, it’s spot on.  The fact that an attorney with the last name of Bennett wrote this just goes to show how absolutely perfect the universe really is, but I’d better stop here before someone calls the Online Butthurt Police, complaining “Bill’s laughing at us again.”


 18 Aug 2014 @ 7:12 PM 

This content is password protected.

Categories: Abuse of Power, ICE Abuses

 14 May 2014 @ 4:25 PM 

DogeVault.com hacked

DogeVault.com hacked was the order of the day Saturday, May 11, 2014.  Such Hack, Much Sad ;-(  If you are active in the crypto-currency market, you have no doubt heard of Mt. Gox and what happened to them. If you aren’t, the tl;dr version is that they were ostensibly the biggest player in the BitCoin trading market, and they got hacked. It was so bad that they ended up filing for bankruptcy.  Imagine going from running a Magic: The Gathering web site, to being worth millions, to bankrupt in very short order. It must have been rough.  The layperson version of the hack was that an attempt was made to transfer some coins, but the response was faked to indicate that the transfer failed when in fact it was successful.  Being stand-up guys, they automatically retried the transaction and got the same failure message in spite of another successful transfer.  This process repeated itself until, well, things ended badly for Mt. Gox.

Online or cloud-based services hosting crypto-currency wallets are a huge target for hackers.  Hitting a traditional bank for a few hundred million would be pretty hard to pull off and even harder to get away with (although it’s too early to claim anyone got away with anything).  But crypto-currency isn’t something most people know about or are involved in, so for many folks it’s the biggest thing they’ve never heard of.  Although DogeVault.com was diligent in trying to keep things secure, there was ultimately a breach, and at the moment there’s not a whole lot being acknowledged about it. If you go to DogeVault.com right now, here’s what you see:

DogeVault.com Downtime Message

 

As of Saturday when it first happened, they had a similar message which didn’t say much but sounded bad. Finally they came forward with this.  Damages and losses aren’t currently known, but hopefully they caught it early enough to stop any serious damage.   As a general security note, there’s a bit of wisdom to be extracted here: outside of not blindly trusting everything to the cloud, if you are hacked, suspect a hack, or think you are infected with a virus or malware, the smartest thing to do is pull the network cable and turn off the computer.  Disconnecting from the network isn’t guaranteed insulation from damage, far from it, but it’s the best thing you can do to minimize damage in just about every case.

I was lucky this time – I had just moved the vast majority of my holdings out of DogeVault.com early Saturday morning during a bout of insomnia.  Ironically, I tried to read to go to bed and flipped to a book I had on crypto-currency. In it, the author strongly admonished people to minimize use of online wallets and to move coins into local wallets (while encrypting them and backing them up, ideally including an additional offsite backup location).  Since I still wasn’t able to fall asleep, I decided to go look at my miners and see how they were doing.  While doing that, I moved all but 200k of my coins to an offline wallet – just a few hours before the attack started.  Every book I’ve read offers this advice, it’s mentioned on just about every forum you come across, but it’s easy to feel a false sense of security.

While it’s true that someone can get in your house and steal your computer much more easily than hackers can hit a place like DogeVault.com, your house is a lot lower profile.  If someone did break into your house, your computer might not even make the list of things they stole.  Encrypting your wallet would keep thieves from using it, and as long as you had a backup you’d be fine. Yes, there’s also the risk of fire, drive failures, etc., but those can be mitigated by regular backups, including one that’s offsite. Coupled with encryption, your exposure to risk is pretty small if you do all of this regularly.

Here’s info about what happened a while back.

 



 11 May 2014 @ 2:19 AM 

DogeCoin Mining

DogeCoin Mining officially commenced today at the Ryan household.  At least in terms of my solo effort it has.

Background:

DogeCoin mining is my latest hobby. Well, technically it’s more precise to say AltCoin mining is my latest hobby, but I’m focused mainly on DogeCoin (at least until I can grab lunch with Chris G Williams and he can show me how I can mine BitCoin in the cloud). I’ve been blessed with some wonderful clients and things couldn’t be going better.  I started out contracting as a CRM developer with a long-term goal of getting the company up and running with an A-list Business Intelligence program.  Within a few months, things progressed so well that I’m now the Director of Analytics and Lead Data Scientist there.  Outside of having my dream job handed to me, I have the pleasure of working with several phenomenally cool people.  Luckily, many aren’t just cool, they’re uber smart.

A few weeks ago, one of the other department heads mentioned that he was learning about crypto-currency in his spare time.  I’ve been dabbling in BitCoin for a few years now, but my interest was mainly wonkish as opposed to commercial.  Computer security has been the focus of much of my professional work as well as a private passion.  Strong cryptography has been a fascination of mine since the earliest days of grad school, when I first came across Bruce Schneier’s masterpiece, Applied Cryptography.  Bruce Schneier and Phil Zimmermann were the first two great thinkers I encountered when I got into computing professionally, and they absolutely ignited a true passion for the art within me.  Computerized cryptography, a new currency – I couldn’t help but fall in love with the concept of BitCoin from the second I heard of it.

For me, the problem was that I started a little late to the game and had a lot of other things I was focused on at the time, so I didn’t start BitCoin mining when I should have and essentially missed the boat. The closest I could get was buying and trading them (although my friend Chris G Williams tells me there’s still ample opportunity to engage in cloud mining of BitCoin).

Anyway, a friend at work decided to start a pool. He opened it up to investors and we’d buy a percentage of some mining equipment. We’d mine, and if successful we’d be rewarded with whatever percentage of the take we invested in originally.   JB, the brains behind the initiative, is a big thinker with many ideas, and we’re still very much in our infancy as far as that goes.  Since DogeCoin was still a pretty new commodity, I thought it would be the perfect coin to mine on my own as well.  I started reading up on what are commonly known as AltCoins and quickly got up to speed on the entire subject.   The primary allure of BitCoin was not as an investment vehicle, but that’s certainly been what’s driven much of its popularity.  Now there’s an entire AltCoin market, and the investment aspect is no small part of it.

Anyway, I started looking to buy DogeCoin and LiteCoin on Craigslist like I had in the past with BitCoin.  At the time, I was traveling to New York on a regular basis, so it was quite easy to find people buying and selling BitCoin.  The Craigslist strategy ended up in a lot of fail this time around, though, as seemingly everyone I responded to was out of state and insisted on doing things in a manner that can best be described as suspicious.  A friend recommended eBay, and that seemed like a pretty safe bet. The only problem was that I wanted to buy a LOT of DogeCoin, and most of the sellers had been ripped off so badly they only wanted to deal in small transactions. After some digging and just good luck, I ran into a seller (who happened to be a graduate student with a background similar to mine) who had plenty to sell. (As a slight digression, I’ve done several transactions with him and found him to be the ideal seller. I’ve found a few others who I’d recommend, but none as highly as this individual. He’s honest, fair, responds to all inquiries quickly and always does what he says he’ll do, and he does so quickly.)

I was on the verge of buying a rig for myself when JB told me there were new super miners coming out in August and that we were on the short list to get the first shipment of them.  August, however, was a long time away.  I spec’d out a rig with 4 GPU cards that ran around 2k and, well, more on that in a bit.  I saw some people on eBay offering time slices on their mining rigs and decided to give that a try. Unfortunately, the ones I ran into, with one exception, were so paranoid it was impossible to do business with them. I found one excellent vendor who I’m still working with. Not only has he always done what he says he’ll do, he’s very open to explaining the process and helping out.   If you’re interested in dipping your toe in without buying a rig, my experience with CompandTech was excellent – so much so that I’m buying a few weeks’ worth of time from him.  I’m sure there are plenty of other good providers out there, but the ones I ran into were all so jaded from being burned that it made them hard to deal with.  Steve of CompandTech is exactly the type of guy you want to deal with on something like this.

DogeCoin Image

While waiting on my heavy artillery to come in, I came across this.    $110.00 for a starter kit? Sure, it didn’t have a lot of muscle, but how could you go wrong? It seemed like a perfect way to get things rolling. It’s cheap, if it didn’t work you’re only out about $150.00 with shipping, and the seller had excellent feedback. So I decided to give it a try.  My other stuff won’t be here for another few weeks and the real super-powered stuff won’t be here until August.  All of that costs 2k on the low end, 7-10k for the actual stuff we’re going to use, so even if the rig failed to pay for itself, I figured it’d be good for learning.

After ordering it, I got notice it had shipped immediately after I paid for it.  The delivery time was supposed to be about 8 days, but it arrived in 4.  It was packaged with care, everything that was supposed to be in it was, and the seller was kind enough to include instructions.  Other than two trivial typos (for example, he references a directory named “Download” which doesn’t exist; the name is “Downloads”), the instructions he includes provide everything you need to get up and running.  I pointed out the typos, but let me be clear, I’m not criticizing him, just the opposite. The fact that the only things wrong with his instructions are utterly trivial remnants of auto-correct speaks volumes about how well things were done overall. Trust me, if this is the worst thing you can find to complain about, you are doing very well.

Unpacking the box took me about 10 minutes. Most of that time was spent cleaning up the styrofoam peanuts he included to make sure the package arrived safely and intact.  I took it into the living room, grabbed one of my non-HD monitors, and plugged in the provided keyboard and mouse.  The box didn’t seem to come with a wireless card (at that price, you really have to be a jerk to expect anything like that), so I hooked it into my switch and just followed his instructions, which got me 90% of the way there. I already have a few pools set up, but I went ahead and set up an account with the same pool he used.  Everything worked perfectly, without even a small glitch.  From unpacking to live mining, the whole process took about 20 minutes, and most of that was just plugging in cables, booting and setting up my accounts.  His instructions covered everything and were both thorough and accurate.

He mentions that a lower-end card is installed by default, but that you can easily upgrade the card if you like.  I’m a little impatient, so I’m heading over to one of the 24-hour Wal-Marts to pick up an upgraded card.  I’m pretty sure I’m going to end up using the 4-card configuration for the time being until my other stuff arrives, and I’ll report back tomorrow with some statistics.

All I can say is that the work it took to actually get up and mining was but a fraction of the work it takes to simply buy the things outright on eBay, Amazon or Craigslist (Craigslist varies greatly from area to area and I don’t want to dissuade you from considering it – you may end up finding great deals there or an opportunity to trade things you don’t want for a rig – I’m just saying that in my experience, between CL in the greater Detroit area and the SC Upstate area, it’s probably not worth the hassle).  If you want to go the eBay or Amazon route, by all means look around – however, if you use any of the vendors I mention above, I can tell you I’ve performed several transactions with each of them and have found them to be extremely honest, easy to deal with, helpful and competent. So much so that even after I have all my mining rigs in full swing, I’ll continue to do business with them, as I’ve found them wonderful to deal with.

Tomorrow morning, I’ll start posting the results and do my best to show the daily progress. I don’t expect anything too dramatic at the moment until I upgrade the video card (but remember, we’re talking about a $100.00 rig here – that’s a fraction of the cost of any hard-core rig you’d contemplate, and you get everything promised and then some).  Right now, I’ve got almost 10k of machinery on order outside of my upcoming investment in the work pool.

My friend Chris G Williams, who’s much more knowledgeable than I am on the subject, says that he doesn’t think DogeCoin has the ‘oomph’ to make it worthwhile (and at first glance, it does seem to be more ‘cute’ than anything else). He’s figured out how to use the cloud to do all of his mining, and hey, if you can still mine BitCoin and make money at it, you must be doing something right.  I have a feeling I’ll be joining him in the cloud, although I think I’ll be sticking with both LiteCoin and DogeCoin for the time being.

If you have no idea what cryptocurrencies are or why they matter, if you’ve heard of them and want to learn more, if you want to learn how to start mining or how to store your wealth in non-traditional currencies or securities, or if you just want a little more background on the subject, you should find the links below helpful.  And yes, if you’re wondering, I have been putting almost all of my extra money into crypto-currencies.  If DogeCoin were simply to increase to $0.01, Kim and I would be taking an early retirement and we’d be able to put Sarah through 4 years at Duke without even working any overtime to do so.

 

Stay tuned:


 

 


 18 Apr 2014 @ 12:34 AM 

Unit Testing MSCRM 2011 – Part II of V

Unit Testing MSCRM 2011 is something that’s often overlooked in MSCRM development.  In Unit Testing MSCRM 2011 – Part I of V, I introduced the basics and background needed to get started with unit tests.  The more I’ve thought about it, the more I’ve realized that an entire release pipeline and development schedule should be applied to MSCRM development, just like any other .NET project or a project using any other technology.  In any case, let’s move on to the next piece.

Review

So far, we went through getting a Unit Test project into your solution and using the AssemblyInitialize, AssemblyCleanup, ClassInitialize, ClassCleanup, TestInitialize and TestCleanup attributes to seed your data and clean it up afterward.  Remember that one of the fundamental goals we’re trying to achieve is to have a project that can be run in multiple environments as your code progresses from development to release.  Assume that you just finished a sprint (or whatever cycle you use) and are ready to move the code from Development over to Test or QA.  You’d first package up your MSCRM solution (we’ll cover automated deployment of MSCRM solutions later in this series) and deploy it.  This will have all of your schema changes and artifacts that were just built.  We will have to deal with testing Plugins and Workflows in a separate article, but don’t worry, we will be covering those even if we’re ignoring them at the moment. Once you’ve published the solution, the ideal situation is to deploy your test project, run it against the new environment, and verify that all of your tests pass.  This accomplishes several things, not least of which is verifying that the new solution changes you made effectively ‘took’.

Assumptions

Before proceeding, the code sample makes the following assumptions:

  1. There is a valid CrmConnection which has its ConnectionString property stored in the <connectionStrings> section of the app.config file.  The Name property is CrmMain (a sample entry is sketched after this list).
  2. There is a valid OrganizationService instance named OrgServiceInstance that uses the CrmConnection referenced in the previous step.
  3. There have been several Entities loaded from an Excel file (Accounts, Contacts, Opportunities, Terms, States).
  4. The data was loaded in the ClassInitialize method.  For each entity, there is a collection that contains the Guid of each newly created record.  This Guid, along with the entity type name, is used in the ClassCleanup method to make sure all newly created records are removed.  This helps ensure that the environment is left in essentially the same condition it was in originally.
  5. The TestContext is referenced in a property called CurrentTestContext.  This contains information loaded from Excel.  For illustrative purposes, there are also List<Entity> collections that hold copies of each of the records created from Excel. This is redundant, but it makes things easier to follow, and I’ll explain how to use just one of them later in the article.
  6. The code is written for readability and is not production code.  If you copy and paste it to perform an operation, please take the steps needed to make it production ready.
  7. The code uses the so-called late bound approach in MSCRM.  The examples would work pretty much identically if the early bound or OrganizationServiceContext approach were used.
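For reference, here is a rough sketch of what the app.config entry from assumption 1 might look like (the URL and credentials are placeholders, not real values):

<connectionStrings>
  <add name="CrmMain"
       connectionString="Url=https://yourorg.crm.dynamics.com; Username=someuser@yourorg.onmicrosoft.com; Password=********" />
</connectionStrings>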

 

TestMethod

The TestMethodAttribute is what’s used to identify a given method as a test method that should be managed and executed by MSTest. Without this attribute, you simply have a method.  The class that contains the method must be annotated with the TestClass attribute. Additionally, the TestMethod attribute must decorate a method, not an event, a property or a constructor. If you need to replicate this functionality elsewhere, you can do so with a workaround, but it probably isn’t advisable.

It is necessary to have a TestMethod attribute on each method that participates in the test run.  Optionally, you can use the TestCategory attribute to provide more information about the test.  TestCategory isn’t required, but I find it beneficial.  For instance, if I were writing a test to verify a method that queried a Contact entity, I would use two TestCategory attributes, one for Contact and one for Query.  The following test verifies that a QueryExpression to identify a contact based on their Social Security Number works.  In practice, this code block would likely be part of another class and we’d simply call the method. For the sake of clarity, I’m showing the code itself.

[TestMethod]
[TestCategory("Contact")]
[TestCategory("Query")]
public void GetContactIdBySSN()
{
    // Match the contact's SSN attribute against the value loaded from Excel.
    ConditionExpression PrimaryCondition = new ConditionExpression();
    PrimaryCondition.AttributeName = "cuckoo_memberssn";
    PrimaryCondition.Operator = ConditionOperator.Equal;
    PrimaryCondition.Values.Add(CurrentTestContext.Properties["ContactSSN"].ToString());

    FilterExpression PrimaryFilter = new FilterExpression();
    PrimaryFilter.Conditions.Add(PrimaryCondition);

    QueryExpression Query = new QueryExpression("contact");
    Query.ColumnSet.AddColumns("contactid");
    Query.Criteria.AddFilter(PrimaryFilter);

    // Real-world code would have exception handlers.
    EntityCollection QueryResults = OrgServiceInstance.RetrieveMultiple(Query);
    Assert.IsNotNull(QueryResults);
}

So this is a pretty simple example of using a QueryExpression.  It simply defines a query for a contact, searching for a cuckoo_memberssn attribute matching the value stored in the CurrentTestContext property’s ContactSSN property. This value could be hard coded, stored in a configuration or settings file, or retrieved from pretty much anywhere.  However, we loaded this information from Excel, so it makes sense to have it defined dynamically as we’ve done here.

When this test runs, the method defined with the AssemblyInitialize attribute will run first, then the ClassInitialize one, then the TestInitialize one.  The corresponding cleanup methods (AssemblyCleanup, ClassCleanup & TestCleanup) will run at the end of each series.  In this case, we’re simply trying to determine whether or not there’s a Contact record with a value that matches the ‘ContactSSN’. If there’s a match, we assume our logic worked.  To that end, we use the Assert class’s IsNotNull method, passing in the results of the RetrieveMultiple request we made previously. For the pedantic out there, there are probably 25 ways to accomplish the same test, and several of them would be better programming practice. The point here is to illustrate how to test code, which is why RetrieveMultiple is used, for instance.

If this test passes, MSTest will indicate a success (aka ‘green light’); if it fails, it will indicate a failure; and if there is no Assert defined, it will come back indeterminate.   After this test and all the other ones defined in the class run, the ClassCleanup method will run, deleting each of the contacts that were created.

Data Driven Tests

One could store a list of Social Security Numbers for the Contact and test them one at a time. This would make sense, and since the data was just created, they should all match.    We would also want to test a few other scenarios.  For instance, each of the following should be tested:

  1. SSN parameter values that are null or have a value of String.Empty should throw an ArgumentNullException (or equivalent validation exception) – see the sketch after this list.
  2. Assuming the business logic dictates it, sequences of 9 numbers should work.  If there are hyphens that separate the tokens, they should work. Periods should work.  Spaces should work. However, if any of these characters appear at an incorrect index, it should probably not work (again, this may or may not correspond with a real-world scenario; some companies may not want any formatting characters, others might just strip the characters out – hopefully you get the idea).
  3. Sequences that don’t exist should intentionally fail.
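As a minimal sketch of the first scenario, a negative test could use the ExpectedException attribute (the ContactHelper class and its GetContactIdBySSN method are hypothetical names for wherever your validation logic lives):

[TestMethod]
[TestCategory("Contact")]
[TestCategory("Validation")]
[ExpectedException(typeof(ArgumentNullException))]
public void GetContactIdBySSN_NullSSN_ThrowsArgumentNullException()
{
    // The test passes only if the stated exception type is thrown.
    ContactHelper.GetContactIdBySSN(null);
}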

Figure 1-1 shows an Excel worksheet with a list of Social Security Numbers.  In practice, these would correspond to values we knew existed (or, in the case of negation, ones we were sure did not exist).

Figure 1-1: List of Contact Social Security Numbers (an example of using MSTest in conjunction with the TestClass, TestMethod and DataSource attributes)

So for this test, we want MSTest to automatically walk through each value and run the test for each listed value.  To do that, we add the DataSourceAttribute and set a few values.

  1. The first argument is the provider type.  You can use other file formats, for instance comma separated values; in this case I’m using Excel 2007 (there’s a slightly different connection string for later versions of Excel, but that’s pretty much irrelevant here).
  2. Next is the Dsn specification, which should contain a value of Excel Files (semi-colon characters are used to delimit the various tokens).  It needs a dbq specification along with the filename. All three of these should be delimited with semi-colons and are part of the same parameter.
  3. Next you specify the sheet name.  Note that whatever name is on the sheet is the name you should use here. Excel has some rules about allowable values, so to keep it simple, I’d recommend using simple alphanumeric characters without spaces or special characters (even though you can do otherwise).  Make sure, however, that the sheet name has a dollar sign at the end of it. So if the Excel sheet name was Contact, a value of Contact$ should be used; if the sheet name was Account, a value of Account$ should be used.
  4. The last parameter, the DataAccessMethod, is optional. It is an enumerated value that lets you specify either Sequential or Random.

Here’s an example of what a definition looks like:

[TestMethod]
[TestCategory("Contact")]
[TestCategory("Query")]
[DataSource("System.Data.Odbc", "Dsn=Excel Files;dbq=|DataDirectory|C:\\ContactSSNs.xls", "ContactSSN$", DataAccessMethod.Sequential)]
public void GetContactIdBySSN()
{}

When you run this test, the behavior may be somewhat counterintuitive. You might be inclined to think the test would just execute once, like its non-data-driven counterparts do. However, that’s not the case.  The test will execute once for each value it finds in the Excel sheet. Figure 1-1 is truncated at the bottom, but there are a total of 24 values.  So this test will be run 24 times, using each subsequent value as the source parameter.  You reference the current value using the TestContext class’s DataRow property. Just like you would with a System.Data.DataTable, you reference the individual row with either a column name or an index.  When you look at the behavior, it seems likely that the scenario is implemented using a DataAdapter, calling the Fill method and then using the resulting DataTable to execute the tests with (however, that’s neither here nor there). When I get a chance, I’ll open up the library using Telerik’s JustDecompile and verify that theory. In any case, here’s what the new version of the test would look like:
[TestMethod]
[TestCategory("Contact")]
[TestCategory("Query")]
[DataSource("System.Data.Odbc", "Dsn=Excel Files;dbq=|DataDirectory|C:\\ContactSSNs.xls", "ContactSSN$", DataAccessMethod.Sequential)]
public void GetContactIdBySSN()
{
    // TestContext.DataRow holds the current row of the data source for this pass.
    ConditionExpression PrimaryCondition = new ConditionExpression();
    PrimaryCondition.AttributeName = "cuckoo_memberssn";
    PrimaryCondition.Operator = ConditionOperator.Equal;
    PrimaryCondition.Values.Add(TestContext.DataRow["ContactSSN"].ToString());

    FilterExpression PrimaryFilter = new FilterExpression();
    PrimaryFilter.Conditions.Add(PrimaryCondition);

    QueryExpression Query = new QueryExpression("contact");
    Query.ColumnSet.AddColumns("contactid");
    Query.Criteria.AddFilter(PrimaryFilter);

    // Real-world code would have exception handlers.
    EntityCollection QueryResults = OrgServiceInstance.RetrieveMultiple(Query);
    Assert.IsNotNull(QueryResults);
}

If each of the SSNs in the Excel file matches a given contact in the CRM instance referenced by the CrmConnection, then each pass should return an instantiated Entity value. Since it’s not null, the assertion should pass, and voilà.  I randomly chose 24 records, but it could just as easily be 100 or 1000.  To test the affirmative case, it only makes sense to test values that you just added, so you know that they are actually there. However, you could also make sure that the values weren’t valid SSNs and verify that the tests fail. You could purposely make sure that the SSN was in an invalid format and use the ExpectedException attribute to make sure that validation was working properly. There’s quite a bit you could test here. And since all the heavy lifting is done by MSTest, once you get the setup in place, the rest of the work is pretty trivial. Running the test 1000 times is scarcely more difficult than running it 100 times.

 

Conclusion:

So far, we’ve built a test project and started creating some basic test methods. Using the TestClass and TestMethod attributes, we’ve been able to create a simple test verifying that a query for a Social Security Number in a CRM Contact works. We’ve augmented the TestMethod with the TestCategory attribute, which allows us to organize and run tests together.  We then started using the DataSource attribute to run data driven tests.  In this example, we only used the Sequential value of the DataAccessMethod enumeration, but we could just as easily have used the Random value, and in many cases it would be a better choice. In CRM, because tables are so heavily linked together, it’s often necessary to create data in a specific order so that the Guid identifier for a record can be recorded and used to build subsequent records.  Once the data is created (which we argued makes the most sense to do in the AssemblyInitialize, ClassInitialize or TestInitialize methods), accessing it randomly probably makes a lot of sense in most cases.

The ultimate objective so far is to be able to thoroughly test code blocks and to do so in a manner that lets us dynamically determine what we want to test.  By using a data source that’s external (in this case, Excel), we have the ability to change what we test.  Ultimately, this lets us push out a build, run the test suite (which creates a good bit of test data), run several tests against it and then delete the data.  While this may seem less than spectacular, just think of how long it would take you to create 5 Accounts and 10 total Contacts associated to those Accounts by hand.  Microsoft Dynamics CRM is frequently described as “clicky” – it’s one of the main criticisms I’ve come across – so creating test data manually can be time consuming. Sure, you can create Scribe jobs or use Excel sheets to import data, but that still leaves you responsible for cleaning up the data, and it’s awkward. Instead, you can use TFS automation to push out a new solution, publish it, run a set of data driven tests that could easily span several thousand tests (affirmative and negative) and then clean everything up.

This is a huge benefit and would greatly help any development organization. Let’s face it, a lot of CRM development projects either fail or run much longer than planned.  Anything that adds quality and provides a means of measuring quality and progress should be considered.  Seeing how little effort it takes to employ data driven tests, I’d highly recommend them for any project. Additionally, by using Excel or a similar data source, it’s very easy for non-developers to create test plans and test projects which can be run and verified.  Excel is a very well known product, so there’s very little learning curve associated with using it.  If you don’t want to unit test, or you want to run tests one at a time, that’s your prerogative. But keep in mind that adding dynamic tests that can exercise items thousands of times over takes so little extra work, the question isn’t ‘why should we do this?’ but ‘why aren’t we?’

In the next piece, we’ll cover automated deployment of MSCRM solutions and then running the unit tests as part of the build.

 


 

 

 

 

 

Note on Decompilers

As a general note, you may have noticed that I frequently mention Telerik’s JustDecompile. I have no affiliation with the company other than having a friend or two who works there.  There are many decompilers on the market, Red Gate’s Reflector being one of the most prominent. I was a user of Reflector since it first came out, and I think Red Gate is a great company and the current offering of Reflector is still a superb product.  I would recommend either product and use them both.  In any case, looking into the internals is what often allows you to cut through the clutter and figure out how things truly work.  I’d highly recommend you buy a license for one of these two products if you haven’t already.  I’ve also had a great deal of success with Salamander and like it a lot.

 

 

 


 16 Apr 2014 @ 11:27 PM 

Unit Testing MSCRM 2011 – Part I of V

Unit Testing MSCRM 2011 is really no different from testing any other .NET application, with the possible exception of Plugins and Workflows. However, many shops completely overlook automated unit testing, and that is very likely one contributing factor to why so many MSCRM projects fail or are much harder than they need to be.

Introduction:

One thing that has always bothered me about Microsoft Dynamics CRM development is how, organizationally, it’s not treated like ‘real’ development.   What does that mean exactly? Well, it depends.  I’ve been a consultant at several companies, from very small ones (1-2 developers) to large ones (multiple teams with over 30 developers and every formal role that goes along with that).  I can think of only 3 that had an actual API or framework of truly reusable CRM code that projects were put together with (which isn’t the same thing as saying the others didn’t reuse code, just so I’m being clear).  I can think of only two that had automated unit testing. I can think of only 2 that had automated builds.  You get the idea, hopefully.  You use C# or Visual Basic .NET for the most part to build CRM applications. You use ASP.NET, Silverlight, Windows Communication Foundation and Workflow Foundation to build out CRM applications. You do most if not all of your development inside of Visual Studio.  Yet in many cases, you could count more differences than similarities between CRM projects and other ones.  So let this be an opening salvo in trying to enlist others to build CRM projects the same way they build other .NET projects. Unit Testing MSCRM 2011 happens all too infrequently in most places, if it happens at all, and as easy as it is to do, few things can aid your project’s quality more.

MSTEST

There are several good tools to implement automated Unit Testing MSCRM 2011 with, including NUnit, MbUnit, MSTest, Telerik Test Studio and others. If you’d like to debate the merits of one framework over another, there are plenty of places on the net where you’ll find many kindred spirits.  You won’t find one here.  What matters is that you automate your unit testing; any of the popular tools should suffice.  If you haven’t started unit testing, aren’t familiar with it or need to understand why it matters, you’d do well to do some background reading on the subject first. As far as I’m concerned, the debate as to whether or not you should unit test and how comprehensively you should do it was finished years ago.  You should definitely write automated unit tests. You should test early and often.  And you should test as much of your code base as you can.

[TestClass]

Visual Studio lets you create separate unit test projects that serve as convenient containers to hold your tests. If you use another framework, the same principles apply.  Assuming that you’re using Visual Studio 2013 Professional (or an edition with testing enabled), you’ll see a distinct Test menu (Figure 1-1) that will allow you to interact with Test Explorer (Figure 1-2), among other things:

Figure 1-1: Visual Studio 2013 Test Menu

Figure 1-2: Visual Studio 2013 Test Explorer

 

To start with, you should give your classes a coherently named namespace.  YourCompanyName.ProjectName.Tests works as a starting point.  From there, you have 3 choices of items you can add to your test project:

  1. Basic Unit Test (An empty basic Unit Test Declaration)
  2. Ordered Test (Use an ordered test to execute a set of existing tests in an order you specify)
  3. Unit Test (An empty Unit Test class declaration)

The summaries at the end of each item aptly describe what they do.  The important takeaway is that each set of test cases you want to run should be contained in a class definition that’s decorated with the [TestClass] attribute.
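To make that concrete, a minimal skeleton looks like this (the namespace is the placeholder suggested above, and the class name is just an example):

using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace YourCompanyName.ProjectName.Tests
{
    [TestClass]
    public class ContactQueryTests
    {
        // [TestMethod] members go here; see the next article in this series.
    }
}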

AssemblyInitialize & AssemblyCleanup

Lacking a better metaphor, you can think of AssemblyInitialize & AssemblyCleanup as the outer layers of an onion.  MSDN describes AssemblyInitialize in the following way:

Identifies a method that contains code to be used before all tests in the assembly have run and to allocate resources obtained by the assembly.

AssemblyCleanup is the inverse counterpart, running at the end of everything to serve as the final cleanup agent.

Do you need these? Maybe, maybe not.  Generally speaking, a good unit test will allow itself to run in whatever environment the developers want it to.  To do that, it must generally be configurable.  Then it should create all the data it needs.  The tests should be run on this data, validating the code along the way. Finally, the data that was created for the test should be removed so that the system is in the same state it was in when the testing began.

At this point I’ll take a slight digression. I’ve heard (and participated in) many arguments around this subject. Many will argue that you can create effectively disposable CRM Online instances, so that cleaning up the data at the end isn’t really necessary. Personally, I don’t see much validity in this unless you’re coming from the “I don’t do anything more than the minimum I have to” perspective.  Others will argue you can use virtual machine images, load the data once there, save snapshots, etc., thereby negating the need for adding data and cleaning it up. I think it’s a lame argument because it seems to require more resources and makes things less atomic, but devotees insist they can script things and do it in a way that’s faster than anything that can be done with setup and teardown.  Whatever.  The main point is that you want to test your code logic – you don’t want to get false positives or negatives because of a data issue. If you have a way that’s repeatable, can be run in multiple environments and doesn’t open itself up to data issues, go with my blessing.  Just keep in mind that, done correctly, your unit tests can (and should) validate that any given version of code will run on a given environment. If you do a promotion or deploy a new solution/web site/whatever, unit tests can be run against that environment to verify everything works as advertised.  This may or may not be advisable on a production system, but it certainly is something you want to do everywhere else.  Creating the data, testing it and deleting it is one sure way to accomplish this goal.

Anyway, if you want to set up data that spans classes, AssemblyInitialize and AssemblyCleanup are the places to facilitate that, as the sketch below shows.
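Here’s a minimal sketch of the pair (the method names are arbitrary; MSTest only cares about the attributes, and both methods must be public and static, with the AssemblyInitialize one taking a TestContext):

[AssemblyInitialize]
public static void AssemblySetup(TestContext context)
{
    // Runs once, before any test in the assembly; create data shared by all test classes here.
}

[AssemblyCleanup]
public static void AssemblyTeardown()
{
    // Runs once, after every test in the assembly has finished; delete the shared data here.
}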

 

ClassInitialize & ClassCleanup

ClassInitialize and ClassCleanup are the next layer of the onion. ClassInitialize runs before the first test in the class runs, and ClassCleanup runs after all the tests in the class have been run. The only things noteworthy about these are that they are static, and that ClassInitialize takes in an instance of the TestContext class.  The TestContext class is a container that lets you store things you can reference throughout the tests.

The signature for ClassInitialize is shown below:

private static TestContext currentTestContext;

/// <summary>
/// Gets or sets the CurrentTestContext property.
/// </summary>
public static TestContext CurrentTestContext
{
    [DebuggerStepThrough()]
    get
    {
        return currentTestContext;
    }
    [DebuggerStepThrough()]
    set
    {
        currentTestContext = value;
    }
}

[ClassInitialize()]
public static void ClassInitialize(TestContext context)
{
}

Anyway, this is a great place to load data.  For instance, you may have data corresponding to each entity you want to create stored in an Excel sheet.    Here is a working example of code I used in production to load a list of Dealers from an Excel sheet.  The location is stored in a settings file, so hopefully you can deduce that on your own.  Also keep in mind that with Excel, each sheet name corresponds to a table name; it just takes a dollar sign ($) character at the end of it:
// Requires the System.Data, System.Data.OleDb and System.Globalization namespaces.
String ConnectionString = Settings.Default.ExcelFileLocation;
DataTable dt = new DataTable();
OleDbConnection conn = new OleDbConnection(ConnectionString);
String SQL = String.Format(CultureInfo.CurrentCulture, "SELECT * FROM [{0}$]", "Dealer");
OleDbDataAdapter adapter = new OleDbDataAdapter(SQL, conn);
adapter.Fill(dt);

Afterward, I call a method called HydrateDealers where I just pass in the DataTable, loop through the rows and create the corresponding entities.  I also create a collection of Guids corresponding to the newly created entity id values.  This is so that in the ClassCleanup method, I can just loop through the collection, using the entity type (in this case, dev_dealer) and the id, and call the OrganizationService instance’s Delete method.  Exception handling only writes to the Console for this example; in practice I log it using the Enterprise Library and the TestResults feature, but the article is already getting long.
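HydrateDealers isn’t shown in full here, but a rough sketch of it might look like the following (the attribute and column names are hypothetical, and CurrentDealersList is the Guid collection used by the cleanup code below):

private static void HydrateDealers(DataTable dt)
{
    foreach (DataRow row in dt.Rows)
    {
        // Build a late bound dev_dealer entity from the Excel row (column names are assumptions).
        Entity dealer = new Entity("dev_dealer");
        dealer["dev_name"] = row["Name"].ToString();

        // Track the new record's id so ClassCleanup can delete it afterward.
        FinancingUnitTest.CurrentDealersList.Add(ServiceInstance.Create(dealer));
    }
}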

Using the ClassCleanup counterpart, this method runs through each of the collections I populated earlier. Using the stored Guids (which I get from OrganizationService.Create), I can loop through each collection and delete anything I created. Done correctly, I leave the system in exactly the same shape I got it in with respect to data.  This means no “Test Accounts” or any garbage of that sort in the system:

[ClassCleanup]
public static void CrmCleanup()
{
    foreach (Guid dealerId in FinancingUnitTest.CurrentDealersList)
    {
        try
        {
            ServiceInstance.Delete("dev_dealer", dealerId);
        }
        catch (Exception ex)
        {
            // Example only; production code should log this properly.
            Console.WriteLine(ex.ToString());
        }
    }
}

TestInitialize & TestCleanup

These run at the beginning and end of each test and let you set up and tear down things that are specific to the test.  The same principle applies here that applies to the AssemblyInitialize, AssemblyCleanup, ClassInitialize and ClassCleanup attributes; they just apply at a more granular level.  The Assembly attributes are the outermost layer of the onion; the Test attributes are the innermost ones, as the sketch below shows.
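A minimal sketch (the method names are arbitrary; unlike their Assembly and Class counterparts, these are instance methods):

[TestInitialize]
public void TestSetup()
{
    // Runs before each [TestMethod] in the class.
}

[TestCleanup]
public void TestTeardown()
{
    // Runs after each [TestMethod] in the class.
}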

Now, one last thing before closing this portion of the article.  It is often useful to have a reference to the OrganizationService.  My preference is to create a property of type CrmConnection and one of type OrganizationService (if you don’t know how to do that, there are several examples throughout this blog).  This particular approach lends itself well to the so-called ‘late bound’ approach to CRM coding.  But the same principle can be applied using the early bound approach (using the OrganizationServiceClient) or the OrganizationServiceContext.  The main point is that you have something to hook onto to run and test your methods.

With that in mind, the next article will walk through creating several tests using the [TestMethod] attribute.


 16 Apr 2014 @ 3:57 PM 

AssociateEntitiesRequest

AssociateEntitiesRequest is an implementation of the Request/Response pattern used throughout Microsoft Dynamics CRM 2011.  Any time you have entities related in an N:N fashion, you can use this request to programmatically bind them together. (This sample uses C# but can easily be ported to any other .NET language, including Visual Basic .NET.)

In order to use this request, you’re going to have to add a reference to the Microsoft.Crm.Sdk.Proxy.dll library (which is located in the bin directory of the SDK).  This request is a member of the Microsoft.Crm.Sdk.Messages namespace; hence the fully qualified type name is Microsoft.Crm.Sdk.Messages.AssociateEntitiesRequest.  Because there is no Microsoft.Crm.Sdk.Messages dll, this is a source of confusion for many newcomers.  You’ll still likely need references to Microsoft.Xrm.Sdk, Microsoft.Xrm.Client (if you want to use the CrmConnection), System.Runtime.Serialization and System.ServiceModel if you want to use the OrganizationService, and System.Configuration if you want to reference a value in the .config file.
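As a quick reference, the using directives for the types discussed below would look roughly like this (namespaces as of the CRM 2011 SDK):

using System;
using System.Diagnostics;                // DebuggerStepThrough
using Microsoft.Crm.Sdk.Messages;        // AssociateEntitiesRequest
using Microsoft.Xrm.Sdk;                 // EntityReference, IOrganizationService
using Microsoft.Xrm.Client;              // CrmConnection
using Microsoft.Xrm.Client.Services;     // OrganizationService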

Conceptually, think of a Contract. A Contract may have Terms & Conditions.  Terms & Conditions are generally applicable to states or jurisdictions. I’m not a lawyer, so if I’m talking out of my a55, please ignore it; it’s just an example.  Assume you have a Contract entity, a T&C entity and a State entity, and you want to associate a set of T&Cs with a state.  Just to make things easy, I’m going to declare two properties, one a CrmConnection and the other of type IOrganizationService.  The CrmConnection points to a connection string named CrmMain added in a config file. (I add the DebuggerStepThrough attribute on the get accessors just for good measure, because stepping through the accessors repeatedly can be a huge waste of time.)

IOrganizationService
private static IOrganizationService serviceInstance;

public static IOrganizationService ServiceInstance
{
    [DebuggerStepThrough()]
    get
    {
        if (serviceInstance == null)
        {
            serviceInstance = new OrganizationService(ConnectionInstance);
        }
        return serviceInstance;
    }
}

CrmConnection
private static CrmConnection connectionInstance;

public static CrmConnection ConnectionInstance
{
    [DebuggerStepThrough()]
    get
    {
        if (connectionInstance == null)
        {
            connectionInstance = new CrmConnection("CrmMain");
        }
        return connectionInstance;
    }
}

With that in place, we’re ready to use the AssociateEntitiesRequest.  You’ll need to know three things:

  1. Name of the first Entity you want to associate using the AssociateEntitiesRequest. (used for the Moniker1 property of the request)
  2. Name of the Target entity you want to associate using the request.  (used for the Moniker2 property of the request)
  3. The name of the Relationship (the RelationshipName property).

If you have the names for Moniker1, Moniker2 and RelationshipName, ignore the next paragraph.

If you do not know any of the three items mentioned above, the following will help you identify them:

To find the name of the relationship, open the solution, find either of the entities in question and choose the N:N Relationships node. You’ll see a list of possible relationships on the right-hand side. There generally shouldn’t be too many of these, and the right one should jump out at you (because the “Other Entity” column should have the name of one of the two entities you’re looking to relate):

AssociateEntitiesRequest – Microsoft Dynamics CRM 2011

With these items identified, associating the items is absolutely trivial.

1- Declare and instantiate an instance of the AssociateEntitiesRequest. It only has the default constructor available, so there’s nothing exotic about it:

AssociateEntitiesRequest AssociateRequest = new AssociateEntitiesRequest();
2- You need to set the Moniker1 and Moniker2 properties of the request.  These will be of type EntityReference.  To create an EntityReference, you simply need the entity’s identifier (the Guid assigned by CRM) and the CRM type name.  So, for example, if you wanted to create an EntityReference to an Account entity with a Guid of 73BC8091-3B13-452A-AC0E-BE5EB18BF735, you’d do the following:

EntityReference AccountReference = new EntityReference("account", new Guid("73BC8091-3B13-452A-AC0E-BE5EB18BF735"));
You’ll do the same for the Moniker1 and Moniker2 properties.
AssociateRequest.Moniker1 = new EntityReference("dev_terms", termId);
AssociateRequest.Moniker2 = new EntityReference("dev_state", stateId);

3- Set the RelationshipName property (in this case, "dev_terms_state"):

AssociateRequest.RelationshipName = "dev_terms_state";

4- Call the Execute method of the OrganizationService instance we created at the beginning of the article, passing in the AssociateEntitiesRequest instance we just created:

ServiceInstance.Execute(AssociateRequest);
I’ve left off proper exception handling and logging for the sake of readability, but it should go without saying that any production code should take exceptions into account.  If you did this correctly and didn’t encounter any errors, you’ll now see the items you specified associated together in CRM.
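Putting steps 1 through 4 together, a consolidated sketch looks like this (ServiceInstance is the property defined above, and termId and stateId are assumed to hold the Guids of the records you want to relate):

try
{
    AssociateEntitiesRequest AssociateRequest = new AssociateEntitiesRequest();
    AssociateRequest.Moniker1 = new EntityReference("dev_terms", termId);
    AssociateRequest.Moniker2 = new EntityReference("dev_state", stateId);
    AssociateRequest.RelationshipName = "dev_terms_state";
    ServiceInstance.Execute(AssociateRequest);
}
catch (FaultException<OrganizationServiceFault> ex)
{
    // Production code should log and handle CRM faults appropriately.
    Console.WriteLine(ex.Message);
}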

If you have not already done so, please make sure you have downloaded the Microsoft Dynamics CRM 2011 Software Development Kit (SDK).  There are certain features only available in certain rollups, so make sure the version of the SDK you are using matches the rollup you are currently working with on your Dynamics CRM 2011 instance (or the rollup you plan to be using).

Keywords: AssociateEntitiesRequest, Moniker1, Moniker2, RelationshipName, DebuggerStepThrough, DebuggerStepThroughAttribute, Microsoft.Crm.Sdk.Proxy, Microsoft.Crm.Sdk.Messages, EntityReference, Guid, MSCRM, Microsoft Dynamics CRM, Dynamics CRM 2011, Microsoft Dynamics CRM 2011 Software Development Kit, ConnectionString, CrmConnection.ConnectionString, Bill Ryan, William G. Ryan, OrganizationService, IOrganizationService, CrmConnection




 14 Apr 2014 @ 9:43 PM 

Dogecoin

Dogecoin is a big part of what I'm going to start talking about. I'll also be covering several other topics, but Dogecoin will be the focus, from mining to faucets to trading.

Artificial Intelligence, Bots, Forensics, Countermeasures

I've never been good at keeping my blog up to date, in large part b/c I constantly find new things interesting. I went dormant for a while and am going to start posting again regularly. Truth be told, I've refrained from posting in large part b/c of the BS associated with it. You see, a blog post is just that, a blog post. However, it can serve as a Rorschach test: people see what they want in it. I've spent countless hours defending myself from delusional accusations that posts were about specific people. The legal system isn't what you see on Law & Order. You can be very innocent and, fortunately, if you have means, it's often easy to prove that. But you can spend a lot of money in the process.

If you happen to get on the bad side of a law enforcement officer, things as innocuous as walking downstairs to your car can be framed as interfering with an investigation, and you can be called on to prove otherwise. In any case, I got really sick of playing whack-a-mole with the colorful imagination of someone who flattered themselves to think everything I wrote revolved around them.

I've spent a lot of time helping victims of computer crimes. If you look at my articles on "The Hacking of DB Singles.org" or similar ones, I think it's pretty clear where I stand. I've helped countless people avoid victimization online, as well as trying to mitigate the damage they've already incurred. If I wanted to, I could post a list of over 30 people who would gladly spell out in detail how much I've helped them. But being a white knight isn't my thing, and running around posting about how great you are is, well, pathetic. In any case, I write about computer crime and common scenarios people don't think about. I have 7 articles queued up from over a year ago that I've hesitated to post b/c I knew certain parties would use them as 'evidence' that I was about to engage in such crimes.

A big part of what I write uses real-world, tangible scenarios that make people understand how vulnerable they are. To do this while being a cyber-criminal myself would be the height of stupidity. Here, let me write out a blueprint on how to commit a crime, and then let me go commit it. Only a fool would do such a thing. I'm a lot of things, but being a fool isn't something I've been accused of much.

The more real the scenarios, the more certain parties have claimed I'm talking about them. Well, the time has come: I've had enough. I'm tired of withholding valuable information that could protect people b/c of fear that someone will delusionally misrepresent what I've written as some sort of manifesto about what I'm going to do.

Recently, I've had the pleasure of taking a job as Director of Data Science for a very successful publicly traded company. It's not work; it's like getting paid to play. In the course of that, I get to investigate and play with many different technologies. From here on in, the blog's focus will be as follows. My intent is to inform and protect. Anyone who takes it as being about someone else is seriously fooling themselves (and should that day arrive in court, I'm quite confident any and every reasonable person will see it for exactly what it is: benevolence shown by someone fascinated by technology, helping others avoid victimization):

Keywords: Cryptocurrency, Cryptography, Dogecoin, Litecoin & Bitcoin, Artificial Intelligence, Advanced Statistics and Calculus, Web Services and Web APIs, Natural Language Processing, Zero-Day Exploits, Digital Forensics (Cellebrite in particular), Protecting Yourself from Being Victimized Online, Anonymity, Tails, UDOO, Raspberry Pi, Bots, Python

 



 31 Aug 2013 @ 8:28 PM 

Nootropics – Phenibut from Liftmode

I’ve been very interested in Nootropics for a while now. I frequently get questions about which ones I take, where they can be procured and what areas can be enhanced with them.

Phenibut is one of the more popular products in this area. There is some debate about whether or not it's actually a Nootropic, but in my experience the Nootropic benefits are profound.

If you search around for Phenibut, you’ll see it discussed pretty frequently in Bodybuilding forums.  It’s been used in Russia (and the former Soviet Union) for years and is prescribed by doctors there today.  I’ve used it for about 2 months and have grown quite fond of it.  Like anything, I’ll admit it’s possible that the benefits I experience are the result of the placebo effect but I strongly doubt it.

So what does it do? Depending on the dosage, Phenibut's effects range from energizing to sedating. Personally, I started out taking 500mg once or twice a day and didn't really notice any effect; in fact, I really started to doubt its efficacy at first. Then I increased the dosage to 1500mg one day and felt very energized and pleasant. I tried taking 2g twice a day for the next week and it seemed to really give me a boost. Just for experimentation, I would take 2g before I went running and compare how I felt to times I'd go running without any Phenibut. Again, it could be mostly placebo, but it seemed to help tremendously.

A slight digression on this note: I've had sleeping problems for years, but the last 4 years were very problematic for me in terms of sleep. Once I got heavy, the more weight I gained, the worse my sleeping problems became. Things got so bad that I would basically not sleep all week and then crash all weekend. I went from being a dyed-in-the-wool 'morning person' to someone who had extreme difficulty waking up. After waiting much too long, I had a sleep study performed, was diagnosed with Sleep Apnea and started CPAP therapy. The CPAP fixed my sleep problem overnight and had a huge impact on my health, but b/c I had had 'insomnia' for so long, my sleep cycles were totally out of whack. I would generally have trouble going to sleep before 5:30 AM; using the CPAP, I would sleep 6-8 hours a day, sleep like a rock, and all would be great, but I still had a lot of trouble falling asleep. I started trying a Philips goLITE (which I'll post a review about). The goLITE helped me wake up on time, but I still had trouble falling asleep. Phenibut seemed to be the fix to the going-to-sleep problem I was still experiencing.

Phenibut gives you a notable energy boost starting 5-10 minutes after the first dose. Even though I describe the effects as an energy boost, it's much different from Caffeine or Ephedrine; mainly, it doesn't raise my heart rate or blood pressure, and there's no 'racy' or nervous feeling associated with it. In fact, it's quite the opposite: it always seems to reduce anxiety, relax you a good bit and give you a general feeling I can only describe as well-being.

I read reports of increased tolerance and withdrawal (which many reports say is terribly bad), which I found concerning. I started taking it only once or twice a week, then started taking it more frequently. There have been weeks in the recent past where I've taken it every single day. Generally, though, I keep use limited to 4-5 days a week. The most I've taken in a day is 5g, although that's pretty high; I try to keep it to two doses a day of 1.5-2g. After taking 4g a day for 4 weeks straight, I tried to stop using it completely to see if tolerance was building or if there were any withdrawal effects. I don't want to tell anyone there is no withdrawal (our biochemistry is very unique, and one person may not experience any withdrawal while someone else might have the opposite reaction), but in my case, even after daily usage for 2 months, I was able to take a 2-week break without any notable withdrawal effects. At worst I may have felt a little edgy once or twice, but I'd be hard pressed to say it was actual withdrawal and not just the general tiredness or agitation that happens here and there in life.

Suffice it to say that I'm very glad I found Phenibut and I really like it. I take it before running and it seems to give me a very comfortable boost. I use it before work frequently and it seems to give me a comfortable edge, helping me stay alert, think clearly and stay focused, all without any downside.

So where do you find it? I have tried several brands and types and have settled on LiftMode. I've bought several products from LiftMode and think they are about as good to deal with as a company can be. First off, their product line is extensive: they carry many different sizes for most of their products, everything from small containers of Phenibut (40g for $12.99) to really big ones (500g for $98.88). When you buy something from LiftMode, they include a Certificate of Analysis from a laboratory that confirms the purity of their products. In this case, the Phenibut was analyzed by Analytical Labs in Anaheim; it shows a claim of >99.5% and a result of 100%. LiftMode is great to deal with in every regard that I've seen, and they ship the product the same day in most cases. Delivery is quick and reliable, and they'll send you tracking information if you choose a shipping option that provides tracking updates. Having tried several brands of Phenibut (including many that are much more expensive), I've found that LiftMode's Phenibut is consistently great and the most reasonably priced stuff I've come across. They are very responsive and friendly to deal with, and doing business with them couldn't be easier.

As an aside, if you're going to give Phenibut a try (again, keep in mind that LiftMode offers a 40g container for $12.99), check out LiftMode's L-Theanine and their Noopept. L-Theanine is reported to bolster the efficacy of Phenibut, and in my experience they do seem to complement each other well. Noopept is effective at really small dosages (dosages so small it's hard to believe they work, but countless things are very effective at seemingly small dosages). I'll write more about L-Theanine and Noopept in another post, but if you're interested, take a look at LiftMode's site and look around. Their stuff is very reasonably priced, and you can try a Nootropic buffet including a sampling of almost everything they sell without breaking the $150.00 mark. I've tried just about everything they have and have been consistently pleased with everything I've gotten from them. I'd encourage you to give them a try, and I'd love to hear any feedback on any Nootropics you've tried; let me know if your experience was similar.

 

Keywords: LiftMode, Phenibut, L-Theanine, GHB, Nootropics, Noopept


Posted By: admin
Last Edit: 31 Aug 2013 @ 09:38 PM

Categories: CRM 2011 Development





 Last 50 Posts
 Back
 Back
Change Theme...
  • Users » 11643
  • Posts/Pages » 125
  • Comments » 415
Change Theme...
  • VoidVoid « Default
  • LifeLife
  • EarthEarth
  • WindWind
  • WaterWater
  • FireFire
  • LightLight

Code Sample Reference



    No Child Pages.

Disclaimer



    No Child Pages.