Since this is clearly homework I am not going to throw the answers out there, but I will give you some tips for approaching them.
For your first problem, you need to think back to the definition of a logarithm. What would the answer be if it were this?
2^(log2(n))
If I plug 4 in for n, the log evaluates to 2, and the power then evaluates to 4. If I plug in 8, I get 8 back. In other words, any base b raised to the log base b of any value n gives back n: the exponentiation and the log cancel each other out when they share the same base.
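If you want to convince yourself numerically, here is a quick sketch in Python (nothing about it is specific to your assignment, it just checks the cancellation):

```python
import math

# b ** log_b(n) should give back n (up to floating-point rounding),
# no matter which base b we pick, as long as both sides use the same one.
for n in (4, 8, 100, 1_000_000):
    print(n, 2 ** math.log2(n))    # base 2 on both sides
    print(n, 7 ** math.log(n, 7))  # base 7 on both sides
```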
What happens if the exponentiation has a larger base than the logarithm? A smaller base? And how fast do polynomials grow compared to exponentials and logarithms, regardless of base?
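A small numeric experiment can build intuition for those first two questions. This sketch just prints the mismatched-base expressions next to familiar reference functions (the choice of 4 and 2 as bases is mine, not from your assignment):

```python
import math

# When the exponentiation base and the log base differ, the result no longer
# collapses to n; print a few values next to familiar reference functions
# and watch which one each column tracks.
for n in (16, 256, 65536):
    print(n, 4 ** math.log2(n), n ** 2)            # larger base, compared with n^2
    print(n, 2 ** math.log(n, 4), math.sqrt(n))    # smaller base, compared with sqrt(n)
```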
For the second problem, what happens when you apply any exponentiation or factorial to a function that is polynomial or lower? What dominates in the long run, assuming you are not doing 2^(log2(n)) or something else that cancels out?
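Again, a few printed values make the ordering hard to miss. This is just a generic comparison of a cubic, an exponential, and a factorial (n^3 is my arbitrary stand-in for "polynomial or lower"):

```python
import math

# A polynomial, an exponential, and a factorial side by side.
# The exponential overtakes the cubic quickly, and the factorial
# leaves both behind almost immediately.
for n in (5, 10, 20, 30):
    print(n, n ** 3, 2 ** n, math.factorial(n))
```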
Remember, with Big-Oh you need to think about arbitrarily large trends, not specific numbers: if you looked down the n axis trillions of values out, what would the dominating factor be? I could plot 1.00000000000000000000000001^n and it would stay very small for a long time, but eventually it would grow faster than n^10000000000000000. Granted, in practice the first is preferable to the second, but we are talking theory here, not production-quality algorithms.
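You can even watch a crossover like that happen, though with much tamer constants than the ones above (1.01 and 10 are placeholders I picked so the loop finishes; the numbers in the paragraph would cross over far too late to iterate to):

```python
# Find the first n where a barely-greater-than-1 exponential overtakes
# a degree-10 polynomial. Start at n = 2 to skip the trivial n = 1 case.
n = 2
while 1.01 ** n <= n ** 10:
    n += 1
print(f"1.01**n first exceeds n**10 at n = {n}")
```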